Documentation ¶
Overview ¶
Package ai contains the data models for LLM Function Calling features
Index ¶
Constants ¶
const FunctionDefinitionKey = "function-definition"
FunctionDefinitionKey is the yomo metadata key for function definition
Variables ¶
var ReducerTag uint32 = 0xE001
ReducerTag is the observed tag of the reducer
Functions ¶
This section is empty.
Types ¶
type ChainMessage ¶ added in v1.18.5
type ChainMessage struct {
	// PreceedingAssistantMessage is the preceding assistant message in the llm response.
	PreceedingAssistantMessage interface{}
	// ToolMessages is the tool messages aggregated from the reducer-sfn by the AI service.
	ToolMessages []ToolMessage
}
ChainMessage is the message for chaining an llm request with a preceding `tool_calls` response
type ErrorResponse ¶ added in v1.18.5
type ErrorResponse struct {
Error string `json:"error"`
}
ErrorResponse is the response body returned on error
type FunctionCall ¶
type FunctionCall struct {
	// TransID is the transaction id of the function calling chain, used for
	// multi-turn llm requests.
	TransID string `json:"tid,omitempty"`
	// ReqID is the request id of the current function calling chain, because
	// multiple function calls may occur within the same request chain.
	ReqID string `json:"req_id,omitempty"`
	// Result is the result of the function calling.
	Result string `json:"result,omitempty"`
	// Arguments is the arguments of the function calling. It should be kept in
	// this context for the next llm request in a multi-turn request scenario.
	Arguments string `json:"arguments"`
	// ToolCallID is the tool call id of the function calling.
	ToolCallID string `json:"tool_call_id,omitempty"`
	// FunctionName is the name of the function.
	FunctionName string `json:"function_name,omitempty"`
	// IsOK indicates whether the function calling succeeded.
	IsOK bool `json:"is_ok"`
}
FunctionCall describes the data structure used when invoking an sfn function
func (*FunctionCall) Bytes ¶
func (fco *FunctionCall) Bytes() ([]byte, error)
Bytes serializes the FunctionCall to []byte
func (*FunctionCall) FromBytes ¶
func (fco *FunctionCall) FromBytes(b []byte) error
FromBytes deserializes the FunctionCall object from the given []byte
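Bytes and FromBytes round-trip a FunctionCall through its wire form. Below is a minimal sketch of the same round-trip, assuming JSON encoding (which the json tags above suggest) and using a local mirror of the struct rather than importing the real package:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// functionCall mirrors ai.FunctionCall's JSON shape, for illustration only.
type functionCall struct {
	TransID      string `json:"tid,omitempty"`
	ReqID        string `json:"req_id,omitempty"`
	Result       string `json:"result,omitempty"`
	Arguments    string `json:"arguments"`
	ToolCallID   string `json:"tool_call_id,omitempty"`
	FunctionName string `json:"function_name,omitempty"`
	IsOK         bool   `json:"is_ok"`
}

// roundTrip serializes fc and deserializes it back, as Bytes/FromBytes would.
func roundTrip(fc functionCall) (functionCall, error) {
	b, err := json.Marshal(fc)
	if err != nil {
		return functionCall{}, err
	}
	var out functionCall
	err = json.Unmarshal(b, &out)
	return out, err
}

func main() {
	in := functionCall{
		TransID:      "tx-1",
		ReqID:        "req-1",
		FunctionName: "get_weather",
		Arguments:    `{"city":"Paris"}`,
		IsOK:         true,
	}
	out, err := roundTrip(in)
	if err != nil {
		panic(err)
	}
	fmt.Println(out.FunctionName, out.IsOK) // the deserialized copy matches the input
}
```

The field names `get_weather` and the argument payload are hypothetical; only the struct shape comes from the documentation above.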
type FunctionDefinition ¶
type FunctionDefinition = openai.FunctionDefinition
FunctionDefinition is the function definition
type FunctionParameters ¶
type FunctionParameters struct {
	Type       string                        `json:"type"`
	Properties map[string]*ParameterProperty `json:"properties"`
	Required   []string                      `json:"required,omitempty"`
}
FunctionParameters defines the parameters the function accepts. From the API doc: "The parameters the functions accepts, described as a JSON Schema object. See the [guide](/docs/guides/gpt/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format."
type InvokeRequest ¶
type InvokeRequest struct {
	// Prompt is the user input text for chat completion.
	Prompt string `json:"prompt"`
	// IncludeCallStack indicates whether to include the call stack in the response.
	IncludeCallStack bool `json:"include_call_stack"`
}
InvokeRequest is the request from the user to the BasicAPIServer
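An InvokeRequest marshals to a small JSON body. A sketch of that body, using a local mirror struct (the transport and endpoint are outside this package's scope, and the prompt text is made up):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// invokeRequest mirrors ai.InvokeRequest's JSON shape, for illustration only.
type invokeRequest struct {
	Prompt           string `json:"prompt"`
	IncludeCallStack bool   `json:"include_call_stack"`
}

// requestBody returns the JSON body an InvokeRequest would marshal to.
func requestBody(prompt string, includeStack bool) string {
	b, _ := json.Marshal(invokeRequest{Prompt: prompt, IncludeCallStack: includeStack})
	return string(b)
}

func main() {
	fmt.Println(requestBody("What's the weather in Paris?", true))
	// → {"prompt":"What's the weather in Paris?","include_call_stack":true}
}
```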
type InvokeResponse ¶
type InvokeResponse struct {
	// Content is the content from the llm api response.
	Content string `json:"content,omitempty"`
	// ToolCalls is the tool calls from the llm api response.
	ToolCalls map[uint32][]*openai.ToolCall `json:"tool_calls,omitempty"`
	// ToolMessages is the tool messages from the llm api response.
	ToolMessages []ToolMessage `json:"tool_messages,omitempty"`
	// FinishReason is the finish reason from the llm api response.
	FinishReason string `json:"finish_reason,omitempty"`
	// TokenUsage is the token usage from the llm api response.
	TokenUsage TokenUsage `json:"token_usage,omitempty"`
	// AssistantMessage is the assistant message from the llm api response; only
	// present when the finish reason is "tool_calls".
	AssistantMessage interface{} `json:"assistant_message,omitempty"`
}
InvokeResponse is the response for chat completions
func ConvertToInvokeResponse ¶ added in v1.18.7
func ConvertToInvokeResponse(res *openai.ChatCompletionResponse, tcs map[uint32]openai.Tool) (*InvokeResponse, error)
ConvertToInvokeResponse converts openai.ChatCompletionResponse struct to InvokeResponse struct.
type OverviewResponse ¶
type OverviewResponse struct {
Functions map[uint32]*openai.FunctionDefinition // key is the tag of yomo
}
OverviewResponse is the response for overview
type ParameterProperty ¶
type ParameterProperty struct {
	Type        string `json:"type"`
	Description string `json:"description"`
	Enum        []any  `json:"enum,omitempty"`
}
ParameterProperty defines the property of the parameter
type TokenUsage ¶ added in v1.18.5
type TokenUsage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
}
TokenUsage is the token usage in the response
type ToolMessage ¶ added in v1.18.5
type ToolMessage struct {
	Role       string `json:"role"`
	Content    string `json:"content"`
	ToolCallID string `json:"tool_call_id"`
}
ToolMessage represents an OpenAI tool message
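A ToolMessage carries a function's result back to the LLM as an OpenAI-style `role: "tool"` message. A sketch with a local mirror struct; the content and tool call id values are made up:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// toolMessage mirrors ai.ToolMessage's JSON shape, for illustration only.
type toolMessage struct {
	Role       string `json:"role"`
	Content    string `json:"content"`
	ToolCallID string `json:"tool_call_id"`
}

// toolMessageJSON builds the JSON wire form of a tool message that answers
// the tool call identified by toolCallID.
func toolMessageJSON(content, toolCallID string) string {
	b, _ := json.Marshal(toolMessage{Role: "tool", Content: content, ToolCallID: toolCallID})
	return string(b)
}

func main() {
	fmt.Println(toolMessageJSON(`{"temp":21}`, "call_abc123"))
}
```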