Documentation ¶
Overview ¶
Performs the analysis process on a text and returns the token breakdown of the text.
Index ¶
- Variables
- type Analyze
- func (r *Analyze) Analyzer(analyzer string) *Analyze
- func (r *Analyze) Attributes(attributes ...string) *Analyze
- func (r *Analyze) CharFilter(charfilters ...types.CharFilter) *Analyze
- func (r Analyze) Do(providedCtx context.Context) (*Response, error)
- func (r *Analyze) Explain(explain bool) *Analyze
- func (r *Analyze) Field(field string) *Analyze
- func (r *Analyze) Filter(filters ...types.TokenFilter) *Analyze
- func (r *Analyze) Header(key, value string) *Analyze
- func (r *Analyze) HttpRequest(ctx context.Context) (*http.Request, error)
- func (r *Analyze) Index(index string) *Analyze
- func (r *Analyze) Normalizer(normalizer string) *Analyze
- func (r Analyze) Perform(providedCtx context.Context) (*http.Response, error)
- func (r *Analyze) Raw(raw io.Reader) *Analyze
- func (r *Analyze) Request(req *Request) *Analyze
- func (r *Analyze) Text(texttoanalyzes ...string) *Analyze
- func (r *Analyze) Tokenizer(tokenizer types.Tokenizer) *Analyze
- type NewAnalyze
- type Request
- type Response
Constants ¶
This section is empty.
Variables ¶
var ErrBuildPath = errors.New("cannot build path, check for missing path parameters")
ErrBuildPath is returned in case of missing parameters within the build of the request.
Functions ¶
This section is empty.
Types ¶
type Analyze ¶
type Analyze struct {
// contains filtered or unexported fields
}
func New ¶
func New(tp elastictransport.Interface) *Analyze
Performs the analysis process on a text and returns the token breakdown of the text.
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-analyze.html
func (*Analyze) Analyzer ¶
Analyzer The name of the analyzer that should be applied to the provided `text`. This could be a built-in analyzer, or an analyzer that’s been configured in the index. API name: analyzer
func (*Analyze) Attributes ¶
Attributes Array of token attributes used to filter the output of the `explain` parameter. API name: attributes
func (*Analyze) CharFilter ¶
func (r *Analyze) CharFilter(charfilters ...types.CharFilter) *Analyze
CharFilter Array of character filters used to preprocess characters before the tokenizer. API name: char_filter
func (Analyze) Do ¶
Do runs the request through the transport, handles the response, and returns an analyze.Response.
func (*Analyze) Explain ¶
Explain If `true`, the response includes token attributes and additional details. API name: explain
func (*Analyze) Field ¶
Field Field used to derive the analyzer. To use this parameter, you must specify an index. If specified, the `analyzer` parameter overrides this value. API name: field
func (*Analyze) Filter ¶
func (r *Analyze) Filter(filters ...types.TokenFilter) *Analyze
Filter Array of token filters applied after the tokenizer. API name: filter
func (*Analyze) HttpRequest ¶
HttpRequest returns the http.Request object built from the given parameters.
func (*Analyze) Index ¶
Index Index used to derive the analyzer. If specified, the `analyzer` or `field` parameter overrides this value. If no index is specified or the index does not have a default analyzer, the analyze API uses the standard analyzer. API name: index
func (*Analyze) Normalizer ¶
Normalizer Normalizer to use to convert text into a single token. API name: normalizer
func (Analyze) Perform ¶
Perform runs the http.Request through the provided transport and returns an http.Response.
func (*Analyze) Raw ¶
Raw takes a JSON payload as input, which is then passed to the http.Request. If specified, Raw takes precedence over the Request method.
type NewAnalyze ¶
type NewAnalyze func() *Analyze
NewAnalyze type alias for index.
func NewAnalyzeFunc ¶
func NewAnalyzeFunc(tp elastictransport.Interface) NewAnalyze
NewAnalyzeFunc returns a new instance of Analyze with the provided transport. Used in the index of the library, this allows every API to be retrieved in one place.
type Request ¶
type Request struct {

	// Analyzer The name of the analyzer that should be applied to the provided
	// `text`. This could be a built-in analyzer, or an analyzer that’s been
	// configured in the index.
	Analyzer *string `json:"analyzer,omitempty"`
	// Attributes Array of token attributes used to filter the output of the
	// `explain` parameter.
	Attributes []string `json:"attributes,omitempty"`
	// CharFilter Array of character filters used to preprocess characters before
	// the tokenizer.
	CharFilter []types.CharFilter `json:"char_filter,omitempty"`
	// Explain If `true`, the response includes token attributes and additional
	// details.
	Explain *bool `json:"explain,omitempty"`
	// Field Field used to derive the analyzer.
	// To use this parameter, you must specify an index.
	// If specified, the `analyzer` parameter overrides this value.
	Field *string `json:"field,omitempty"`
	// Filter Array of token filters used to apply after the tokenizer.
	Filter []types.TokenFilter `json:"filter,omitempty"`
	// Normalizer Normalizer to use to convert text into a single token.
	Normalizer *string `json:"normalizer,omitempty"`
	// Text Text to analyze.
	// If an array of strings is provided, it is analyzed as a multi-value field.
	Text []string `json:"text,omitempty"`
	// Tokenizer Tokenizer to use to convert text into tokens.
	Tokenizer types.Tokenizer `json:"tokenizer,omitempty"`
}
Request holds the request body struct for the package analyze
func (*Request) UnmarshalJSON ¶
type Response ¶
type Response struct {
	Detail *types.AnalyzeDetail `json:"detail,omitempty"`
	Tokens []types.AnalyzeToken `json:"tokens,omitempty"`
}
Response holds the response body struct for the package analyze