Documentation ¶
Index ¶
- func WithBaseURL(baseURL string) core.ClientOption
- func WithClientName(clientName *string) core.ClientOption
- func WithHTTPClient(httpClient core.HTTPClient) core.ClientOption
- func WithHTTPHeader(httpHeader http.Header) core.ClientOption
- func WithToken(token string) core.ClientOption
- type Client
- func (c *Client) Chat(ctx context.Context, request *v2.ChatRequest) (*v2.NonStreamedChatResponse, error)
- func (c *Client) ChatStream(ctx context.Context, request *v2.ChatStreamRequest) (*core.Stream[v2.StreamedChatResponse], error)
- func (c *Client) Classify(ctx context.Context, request *v2.ClassifyRequest) (*v2.ClassifyResponse, error)
- func (c *Client) DetectLanguage(ctx context.Context, request *v2.DetectLanguageRequest) (*v2.DetectLanguageResponse, error)
- func (c *Client) Detokenize(ctx context.Context, request *v2.DetokenizeRequest) (*v2.DetokenizeResponse, error)
- func (c *Client) Embed(ctx context.Context, request *v2.EmbedRequest) (*v2.EmbedResponse, error)
- func (c *Client) Generate(ctx context.Context, request *v2.GenerateRequest) (*v2.Generation, error)
- func (c *Client) GenerateStream(ctx context.Context, request *v2.GenerateStreamRequest) (*core.Stream[v2.GenerateStreamedResponse], error)
- func (c *Client) Rerank(ctx context.Context, request *v2.RerankRequest) (*v2.RerankResponse, error)
- func (c *Client) Summarize(ctx context.Context, request *v2.SummarizeRequest) (*v2.SummarizeResponse, error)
- func (c *Client) Tokenize(ctx context.Context, request *v2.TokenizeRequest) (*v2.TokenizeResponse, error)
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func WithBaseURL ¶
func WithBaseURL(baseURL string) core.ClientOption
WithBaseURL sets the client's base URL, overriding the default environment, if any.
func WithClientName ¶ added in v2.5.0
func WithClientName(clientName *string) core.ClientOption
WithClientName sets the clientName header on every request. The client name identifies the project that is making the request.
func WithHTTPClient ¶
func WithHTTPClient(httpClient core.HTTPClient) core.ClientOption
WithHTTPClient uses the given HTTPClient to issue all HTTP requests.
func WithHTTPHeader ¶
func WithHTTPHeader(httpHeader http.Header) core.ClientOption
WithHTTPHeader adds the given http.Header to all requests issued by the client.
func WithToken ¶
func WithToken(token string) core.ClientOption
WithToken sets the 'Authorization: Bearer <token>' header on every request.
Types ¶
type Client ¶
type Client struct {
	Datasets   *datasets.Client
	Connectors *connectors.Client
	EmbedJobs  *embedjobs.Client
	// contains filtered or unexported fields
}
func NewClient ¶
func NewClient(opts ...core.ClientOption) *Client
func (*Client) Chat ¶
func (c *Client) Chat(ctx context.Context, request *v2.ChatRequest) (*v2.NonStreamedChatResponse, error)
The `chat` endpoint allows users to have conversations with a Large Language Model (LLM) from Cohere. Users can send messages as part of a persisted conversation using the `conversation_id` parameter, or they can pass in their own conversation history using the `chat_history` parameter.
The endpoint features additional parameters such as [connectors](https://docs.cohere.com/docs/connectors) and `documents` that enable conversations enriched by external knowledge. We call this ["Retrieval Augmented Generation"](https://docs.cohere.com/docs/retrieval-augmented-generation-rag), or "RAG". For a full breakdown of the Chat API endpoint, document and connector modes, and streaming (with code samples), see [this guide](https://docs.cohere.com/docs/cochat-beta).
func (*Client) ChatStream ¶
func (c *Client) ChatStream(ctx context.Context, request *v2.ChatStreamRequest) (*core.Stream[v2.StreamedChatResponse], error)
The `chat` endpoint allows users to have conversations with a Large Language Model (LLM) from Cohere. Users can send messages as part of a persisted conversation using the `conversation_id` parameter, or they can pass in their own conversation history using the `chat_history` parameter.
The endpoint features additional parameters such as [connectors](https://docs.cohere.com/docs/connectors) and `documents` that enable conversations enriched by external knowledge. We call this ["Retrieval Augmented Generation"](https://docs.cohere.com/docs/retrieval-augmented-generation-rag), or "RAG". For a full breakdown of the Chat API endpoint, document and connector modes, and streaming (with code samples), see [this guide](https://docs.cohere.com/docs/cochat-beta).
func (*Client) Classify ¶
func (c *Client) Classify(ctx context.Context, request *v2.ClassifyRequest) (*v2.ClassifyResponse, error)
This endpoint makes a prediction about which label fits the specified text inputs best. To make a prediction, Classify uses the provided `examples` of text + label pairs as a reference. Note: [Fine-tuned models](https://docs.cohere.com/docs/classify-fine-tuning) trained on classification examples don't require the `examples` parameter to be passed in explicitly.
func (*Client) DetectLanguage ¶
func (c *Client) DetectLanguage(ctx context.Context, request *v2.DetectLanguageRequest) (*v2.DetectLanguageResponse, error)
This endpoint identifies which language each of the provided texts is written in.
func (*Client) Detokenize ¶
func (c *Client) Detokenize(ctx context.Context, request *v2.DetokenizeRequest) (*v2.DetokenizeResponse, error)
This endpoint takes tokens produced by byte-pair encoding and returns their text representation. To learn more about tokenization and byte-pair encoding, see the tokens page.
func (*Client) Embed ¶
func (c *Client) Embed(ctx context.Context, request *v2.EmbedRequest) (*v2.EmbedResponse, error)
This endpoint returns text embeddings. An embedding is a list of floating point numbers that captures semantic information about the text that it represents.
Embeddings can be used to create text classifiers and to power semantic search. To learn more about embeddings, see the embedding page.
If you want to learn more about how to use the embedding model, have a look at the [Semantic Search Guide](/docs/semantic-search).
func (*Client) Generate ¶
func (c *Client) Generate(ctx context.Context, request *v2.GenerateRequest) (*v2.Generation, error)
This endpoint generates realistic text conditioned on a given input.
func (*Client) GenerateStream ¶ added in v2.5.0
func (c *Client) GenerateStream(ctx context.Context, request *v2.GenerateStreamRequest) (*core.Stream[v2.GenerateStreamedResponse], error)
This endpoint generates realistic text conditioned on a given input.
func (*Client) Rerank ¶
func (c *Client) Rerank(ctx context.Context, request *v2.RerankRequest) (*v2.RerankResponse, error)
This endpoint takes in a query and a list of texts and produces an ordered array with each text assigned a relevance score.
func (*Client) Summarize ¶
func (c *Client) Summarize(ctx context.Context, request *v2.SummarizeRequest) (*v2.SummarizeResponse, error)
This endpoint generates a summary in English for a given text.
func (*Client) Tokenize ¶
func (c *Client) Tokenize(ctx context.Context, request *v2.TokenizeRequest) (*v2.TokenizeResponse, error)
This endpoint splits input text into smaller units called tokens using byte-pair encoding (BPE). To learn more about tokenization and byte-pair encoding, see the tokens page.