Documentation ¶
Index ¶
Constants ¶
const (
    GrokChatCompletionsURL = "https://api.x.ai/v1/chat/completions"
    Grok2LatestModel       = "grok-2-latest"
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Choice ¶
type Choice struct {
    Message struct {
        Content string `json:"content"`
    } `json:"message"`
}
Choice represents a choice in the response.
type GrokPrompter ¶
GrokPrompter is a prompter for the Grok LLM.
func NewGrokPrompter ¶ added in v1.38.0
func NewGrokPrompter(apiKey, url, model string) *GrokPrompter
NewGrokPrompter returns a new GrokPrompter instance.
func (*GrokPrompter) Prompt ¶
func (p *GrokPrompter) Prompt(ctx context.Context, opts PrompterOpts) (string, error)
Prompt sends a prompt to the LLM and retrieves the response.
Contextual messages passed via PrompterOpts.Context supply prior conversation turns to the LLM.
AdditionalOpts are not used by this implementation.
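Contextual messages can be assembled as a slice of Message values tagged with the role constants documented below. A minimal, self-contained sketch; note that the Message field layout (Role, Content) is an assumption inferred from the request and response shapes on this page, since the type itself is not shown here:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// MessageRole mirrors the package's string alias for message roles.
type MessageRole = string

const (
	MessageRoleSystem    MessageRole = "system"
	MessageRoleUser      MessageRole = "user"
	MessageRoleAssistant MessageRole = "assistant"
)

// Message is an assumed shape: this page references []Message but does
// not show its fields. In the real package the type is provided for you.
type Message struct {
	Role    MessageRole `json:"role"`
	Content string      `json:"content"`
}

// buildContext assembles prior turns of a conversation, as one might
// pass them in PrompterOpts.Context to ground the next prompt.
func buildContext() []Message {
	return []Message{
		{Role: MessageRoleSystem, Content: "You are a terse assistant."},
		{Role: MessageRoleUser, Content: "What is the capital of France?"},
		{Role: MessageRoleAssistant, Content: "Paris."},
	}
}

func main() {
	b, err := json.Marshal(buildContext())
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}
```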
type JSONSchemaFormat ¶
type JSONSchemaFormat struct {
    Name   string                 `json:"name"`
    Schema map[string]interface{} `json:"schema"`
    Strict bool                   `json:"strict"`
}
JSONSchemaFormat represents the JSON schema format customization of the response.
type MessageRole ¶
type MessageRole = string
MessageRole represents the role of a message.
const (
    // MessageRoleSystem is the role for the system message.
    MessageRoleSystem MessageRole = "system"
    // MessageRoleUser is the role for the user message.
    MessageRoleUser MessageRole = "user"
    // MessageRoleAssistant is the role for the assistant message.
    MessageRoleAssistant MessageRole = "assistant"
)
type PromptRequest ¶
type PromptRequest struct {
    Model          string          `json:"model"`
    Messages       []Message       `json:"messages"`
    Stream         bool            `json:"stream,omitempty"`
    Temperature    float64         `json:"temperature,omitempty"`
    ResponseFormat *ResponseFormat `json:"response_format,omitempty"`
}
PromptRequest represents the payload for the LLM API request.
type PromptResponse ¶
type PromptResponse struct {
    Choices []struct {
        Index   int `json:"index"`
        Message struct {
            Role    string `json:"role"`
            Content string `json:"content"`
            Refusal string `json:"refusal,omitempty"`
        } `json:"message"`
        FinishReason string `json:"finish_reason"`
    } `json:"choices"`
}
PromptResponse represents the response from the Grok LLM API.
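Since the full field layout is shown above, decoding a response body and reading the first choice can be sketched in a self-contained way (the sample JSON below is illustrative, not a real API reply):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// PromptResponse mirrors the documented response shape.
type PromptResponse struct {
	Choices []struct {
		Index   int `json:"index"`
		Message struct {
			Role    string `json:"role"`
			Content string `json:"content"`
			Refusal string `json:"refusal,omitempty"`
		} `json:"message"`
		FinishReason string `json:"finish_reason"`
	} `json:"choices"`
}

// firstContent decodes a raw API body and returns the first choice's content.
func firstContent(body []byte) (string, error) {
	var pr PromptResponse
	if err := json.Unmarshal(body, &pr); err != nil {
		return "", err
	}
	if len(pr.Choices) == 0 {
		return "", fmt.Errorf("no choices in response")
	}
	return pr.Choices[0].Message.Content, nil
}

func main() {
	// Illustrative payload in the documented shape.
	body := []byte(`{"choices":[{"index":0,"message":{"role":"assistant","content":"Hello!"},"finish_reason":"stop"}]}`)
	content, err := firstContent(body)
	if err != nil {
		panic(err)
	}
	fmt.Println(content) // Hello!
}
```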
type PrompterOpts ¶
type PrompterOpts struct {
    SystemPrompt   string                 // The system prompt for the LLM.
    UserPrompt     string                 // The user prompt for the LLM.
    Context        []Message              // The context for the LLM.
    Temperature    float64                // The temperature for the LLM.
    ResponseFormat *ResponseFormat        // The response format for the LLM.
    AdditionalOpts map[string]interface{} // Additional options that individual prompter implementations may optionally support.
}
PrompterOpts represents the options for the Prompter.
type ResponseFormat ¶
type ResponseFormat struct {
    Type       string            `json:"type"`
    JSONSchema *JSONSchemaFormat `json:"json_schema,omitempty"`
}
ResponseFormat represents the response format customization.