Documentation ¶
Overview ¶
Package ai provides LLM Function Calling features.
Index ¶
- Constants
- Variables
- func ConnMiddleware(next core.ConnHandler) core.ConnHandler
- func DefaultExchangeMetadataFunc(credential string) (metadata.M, error)
- func HandleChatCompletions(w http.ResponseWriter, r *http.Request)
- func HandleInvoke(w http.ResponseWriter, r *http.Request)
- func HandleOverview(w http.ResponseWriter, r *http.Request)
- func ListProviders() []string
- func RegisterProvider(provider LLMProvider)
- func RespondWithError(w http.ResponseWriter, code int, err error)
- func Serve(config *Config, zipperListenAddr string, credential string) error
- func SetDefaultProvider(name string)
- func WithContextService(handler http.Handler, credential string, zipperAddr string, ...) http.Handler
- func WithServiceContext(ctx context.Context, service *Service) context.Context
- type BasicAPIServer
- type Config
- type ExchangeMetadataFunc
- type LLMProvider
- type Provider
- type ResponseRecver
- type Server
- type Service
- func (s *Service) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, reqID string, ...) error
- func (s *Service) GetInvoke(ctx context.Context, userInstruction string, baseSystemMessage string, ...) (*ai.InvokeResponse, error)
- func (s *Service) GetOverview() (*ai.OverviewResponse, error)
- func (s *Service) Release()
- func (s *Service) SetSystemPrompt(prompt string)
- func (s *Service) Write(tag uint32, data []byte) error
Constants ¶
const (
// DefaultZipperAddr is the default endpoint of the zipper
DefaultZipperAddr = "localhost:9000"
)
const MetadataKey = "ai"
MetadataKey indicates that the function is an AI function.
Variables ¶
var (
	// ErrNotExistsProvider is the error when the provider does not exist
	ErrNotExistsProvider = errors.New("llm provider does not exist")
	// ErrConfigNotFound is the error when the ai config was not found
	ErrConfigNotFound = errors.New("ai config was not found")
	// ErrConfigFormatError is the error when the ai config format is incorrect
	ErrConfigFormatError = errors.New("ai config format is incorrect")
)
var (
	// ServiceCacheSize is the size of the service cache
	ServiceCacheSize = 1024
	// ServiceCacheTTL is the time to live of the service cache
	ServiceCacheTTL = time.Minute * 0 // 30
)
var RequestTimeout = 5 * time.Second
RequestTimeout is the timeout for requests; the default is 5 seconds.
Functions ¶
func ConnMiddleware ¶
func ConnMiddleware(next core.ConnHandler) core.ConnHandler
ConnMiddleware returns a connection middleware that can be used to intercept the connection.
func DefaultExchangeMetadataFunc ¶ added in v1.18.3
func DefaultExchangeMetadataFunc(credential string) (metadata.M, error)
DefaultExchangeMetadataFunc is the default ExchangeMetadataFunc; it returns empty metadata.
func HandleChatCompletions ¶ added in v1.18.7
func HandleChatCompletions(w http.ResponseWriter, r *http.Request)
HandleChatCompletions is the handler for POST /chat/completions
func HandleInvoke ¶ added in v1.18.3
func HandleInvoke(w http.ResponseWriter, r *http.Request)
HandleInvoke is the handler for POST /invoke
func HandleOverview ¶ added in v1.18.3
func HandleOverview(w http.ResponseWriter, r *http.Request)
HandleOverview is the handler for GET /overview
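func ListProviders ¶
func ListProviders() []string
ListProviders returns the names of the registered llm providers.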
func RegisterProvider ¶
func RegisterProvider(provider LLMProvider)
RegisterProvider registers the llm provider
func RespondWithError ¶ added in v1.18.7
func RespondWithError(w http.ResponseWriter, code int, err error)
RespondWithError writes an error to response according to the OpenAI API spec.
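func Serve ¶
func Serve(config *Config, zipperListenAddr string, credential string) error
Serve starts the AI bridge service (see BasicAPIServer.Serve for the endpoints it exposes).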
func SetDefaultProvider ¶
func SetDefaultProvider(name string)
SetDefaultProvider sets the default llm provider
func WithContextService ¶ added in v1.18.3
func WithContextService(handler http.Handler, credential string, zipperAddr string, provider LLMProvider, exFn ExchangeMetadataFunc) http.Handler
WithContextService adds the service to the request context
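func WithServiceContext ¶ added in v1.18.3
func WithServiceContext(ctx context.Context, service *Service) context.Context
WithServiceContext returns a copy of ctx that carries the given service.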
Types ¶
type BasicAPIServer ¶
type BasicAPIServer struct {
	// Name is the name of the server
	Name string
	// Config is the configuration of the server
	*Config
	// ZipperAddr is the address of the zipper
	ZipperAddr string
	// Provider is the llm provider
	Provider LLMProvider
	// contains filtered or unexported fields
}
BasicAPIServer provides a RESTful service for end users.
func NewBasicAPIServer ¶
func NewBasicAPIServer(name string, config *Config, zipperAddr string, provider LLMProvider, credential string) (*BasicAPIServer, error)
NewBasicAPIServer creates a new RESTful service.
func (*BasicAPIServer) Serve ¶
func (a *BasicAPIServer) Serve() error
Serve starts a RESTful service that provides an '/invoke' endpoint. Users submit questions to this endpoint; the service then generates a prompt based on the question and the registered functions, and calls the LLM provider's completion API to get the functions and arguments to be invoked. These functions are invoked sequentially by YoMo, and all of them write their results to the reducer-sfn.
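For illustration only, a minimal sketch of wiring the server together; the provider name "openai", the credential placeholder, the empty Config, and the import path are all assumptions:

	package main

	import (
		"log"

		ai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
	)

	func main() {
		// Look up a provider registered elsewhere via RegisterProvider;
		// the name "openai" is an assumption for illustration.
		provider := ai.GetProvider("openai")

		srv, err := ai.NewBasicAPIServer("ai-bridge", &ai.Config{}, "localhost:9000", provider, "token:<CREDENTIAL>")
		if err != nil {
			log.Fatal(err)
		}
		log.Fatal(srv.Serve())
	}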
type Config ¶
type Config struct {
	Server    Server              `yaml:"server"`    // Server is the configuration of the BasicAPIServer
	Providers map[string]Provider `yaml:"providers"` // Providers is the configuration of llm provider
}
Config is the configuration of the AI bridge. The configuration looks like:

	bridge:
	  ai:
	    server:
	      host: http://localhost
	      port: 8000
	      credential: token:<CREDENTIAL>
	      provider: openai
	    providers:
	      azopenai:
	        api_endpoint: https://<RESOURCE>.openai.azure.com
	        deployment_id: <DEPLOYMENT_ID>
	        api_key: <API_KEY>
	        api_version: <API_VERSION>
	      openai:
	        api_key:
	        api_endpoint:
	      gemini:
	        api_key:
	      cloudflare_azure:
	        endpoint: https://gateway.ai.cloudflare.com/v1/<CF_GATEWAY_ID>/<CF_GATEWAY_NAME>
	        api_key: <AZURE_API_KEY>
	        resource: <AZURE_OPENAI_RESOURCE>
	        deployment_id: <AZURE_OPENAI_DEPLOYMENT_ID>
	        api_version: <AZURE_OPENAI_API_VERSION>
type ExchangeMetadataFunc ¶ added in v1.18.3
type ExchangeMetadataFunc func(credential string) (metadata.M, error)
ExchangeMetadataFunc is used to exchange metadata.
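As an illustration, a custom ExchangeMetadataFunc might derive per-request metadata from the credential; the "tenant" key is a made-up example, and metadata.M is assumed to be a string map:

	package metadataexchange

	import (
		"strings"

		"github.com/yomorun/yomo/core/metadata" // assumed import path
	)

	// tenantMetadata tags every request with a tenant id parsed from
	// the credential instead of returning empty metadata.
	func tenantMetadata(credential string) (metadata.M, error) {
		return metadata.M{"tenant": strings.TrimPrefix(credential, "token:")}, nil
	}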
type LLMProvider ¶
type LLMProvider interface {
	// Name returns the name of the llm provider
	Name() string
	// GetChatCompletions returns the chat completions.
	GetChatCompletions(context.Context, openai.ChatCompletionRequest, metadata.M) (openai.ChatCompletionResponse, error)
	// GetChatCompletionsStream returns the chat completions in stream.
	GetChatCompletionsStream(context.Context, openai.ChatCompletionRequest, metadata.M) (ResponseRecver, error)
}
LLMProvider provides an interface to the llm providers
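For illustration only, a minimal sketch of a provider that satisfies this interface and registers itself; the package name, the echo behavior, and the import paths are assumptions, not part of this package:

	package echoprovider

	import (
		"context"
		"errors"

		openai "github.com/sashabaranov/go-openai" // assumed import path
		"github.com/yomorun/yomo/core/metadata"    // assumed import path
		"github.com/yomorun/yomo/pkg/bridge/ai"    // assumed import path
	)

	// provider is a toy LLMProvider that echoes the last message back.
	type provider struct{}

	func (p *provider) Name() string { return "echo" }

	func (p *provider) GetChatCompletions(_ context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error) {
		if len(req.Messages) == 0 {
			return openai.ChatCompletionResponse{}, errors.New("no messages in request")
		}
		last := req.Messages[len(req.Messages)-1]
		return openai.ChatCompletionResponse{
			Choices: []openai.ChatCompletionChoice{{
				Message: openai.ChatCompletionMessage{Role: openai.ChatMessageRoleAssistant, Content: last.Content},
			}},
		}, nil
	}

	func (p *provider) GetChatCompletionsStream(context.Context, openai.ChatCompletionRequest, metadata.M) (ai.ResponseRecver, error) {
		return nil, errors.New("echo provider does not support streaming")
	}

	// Register the provider so GetProvider("echo") can find it.
	func init() { ai.RegisterProvider(&provider{}) }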
func GetDefaultProvider ¶
func GetDefaultProvider() (LLMProvider, error)
GetDefaultProvider returns the default llm provider
func GetProvider ¶
func GetProvider(name string) LLMProvider
GetProvider returns the llm provider by name
func GetProviderAndSetDefault ¶
func GetProviderAndSetDefault(name string) (LLMProvider, error)
GetProviderAndSetDefault returns the llm provider by name and sets it as the default provider
type ResponseRecver ¶ added in v1.18.7
type ResponseRecver interface {
	// Recv is the receive function.
	Recv() (response openai.ChatCompletionStreamResponse, err error)
}
ResponseRecver receives stream response.
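A hedged sketch of draining such a stream, assuming Recv reports end-of-stream with io.EOF (the go-openai convention; an assumption here):

	package streamutil

	import (
		"errors"
		"fmt"
		"io"

		"github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
	)

	// drain writes the streamed deltas to w until the stream ends.
	func drain(stream ai.ResponseRecver, w io.Writer) error {
		for {
			chunk, err := stream.Recv()
			if errors.Is(err, io.EOF) {
				return nil // stream finished
			}
			if err != nil {
				return err
			}
			for _, c := range chunk.Choices {
				fmt.Fprint(w, c.Delta.Content)
			}
		}
	}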
type Server ¶
type Server struct {
	Addr     string `yaml:"addr"`     // Addr is the address of the server
	Provider string `yaml:"provider"` // Provider is the llm provider to use
}
Server is the configuration of the BasicAPIServer, which is the endpoint for end-user access.
type Service ¶
type Service struct {
	Metadata metadata.M
	LLMProvider
	// contains filtered or unexported fields
}
Service is used to invoke the LLM provider to get the functions to be executed, then uses a source to send the arguments returned by the llm provider to the target functions. Finally, it uses a reducer to aggregate all the results and writes the result to the http.ResponseWriter.
func FromServiceContext ¶ added in v1.18.3
FromServiceContext returns the service from the request context
func LoadOrCreateService ¶ added in v1.18.3
func LoadOrCreateService(credential string, zipperAddr string, aiProvider LLMProvider, exFn ExchangeMetadataFunc) (*Service, error)
LoadOrCreateService loads or creates a new AI service; if the service has already been created, it returns the existing one.
func (*Service) GetChatCompletions ¶
func (s *Service) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, reqID string, w http.ResponseWriter, includeCallStack bool) error
GetChatCompletions returns the llm api response
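A hedged sketch of calling it from an HTTP handler; the provider lookup, credential placeholder, request id, and import paths are illustrative assumptions:

	package main

	import (
		"encoding/json"
		"log"
		"net/http"

		openai "github.com/sashabaranov/go-openai" // assumed import path
		ai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
	)

	func chatHandler(w http.ResponseWriter, r *http.Request) {
		// Reuse (or create) the cached service for this credential.
		svc, err := ai.LoadOrCreateService("token:<CREDENTIAL>", "localhost:9000", ai.GetProvider("openai"), ai.DefaultExchangeMetadataFunc)
		if err != nil {
			ai.RespondWithError(w, http.StatusInternalServerError, err)
			return
		}

		var req openai.ChatCompletionRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
			ai.RespondWithError(w, http.StatusBadRequest, err)
			return
		}

		// "req-1" stands in for a real request id; includeCallStack is off.
		if err := svc.GetChatCompletions(r.Context(), req, "req-1", w, false); err != nil {
			ai.RespondWithError(w, http.StatusInternalServerError, err)
		}
	}

	func main() {
		http.HandleFunc("/chat/completions", chatHandler)
		log.Fatal(http.ListenAndServe(":8000", nil))
	}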
func (*Service) GetInvoke ¶ added in v1.18.7
func (s *Service) GetInvoke(ctx context.Context, userInstruction string, baseSystemMessage string, reqID string, includeCallStack bool) (*ai.InvokeResponse, error)
GetInvoke returns the invoke response
func (*Service) GetOverview ¶
func (s *Service) GetOverview() (*ai.OverviewResponse, error)
GetOverview returns the overview of the AI functions; the key is the tag and the value is the function definition.
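func (*Service) Release ¶
func (s *Service) Release()
Release releases the service.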
func (*Service) SetSystemPrompt ¶ added in v1.18.7
func (s *Service) SetSystemPrompt(prompt string)
SetSystemPrompt sets the system prompt.
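func (*Service) Write ¶
func (s *Service) Write(tag uint32, data []byte) error
Write writes data to the zipper with the given tag.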
Directories ¶
Path | Synopsis
---|---
provider |
provider/azopenai | Package azopenai is used to provide the Azure OpenAI service
provider/cfazure | Package cfazure is used to provide the Azure OpenAI service
provider/cfopenai | Package cfopenai is used to provide the Cloudflare OpenAI service
provider/openai | Package openai is the OpenAI llm provider
register | Package register provides a register for registering and unregistering functions