Documentation ¶
Overview ¶
Package ai provides LLM Function Calling features
Index ¶
- Constants
- Variables
- func ConnMiddleware(next core.ConnHandler) core.ConnHandler
- func ListProviders() []string
- func ListToolCalls() (map[uint32]ai.ToolCall, error)
- func RegisterFunction(tag uint32, functionDefinition []byte, connID uint64) error
- func RegisterProvider(provider LLMProvider)
- func Serve(config *Config, zipperListenAddr string, credential string) error
- func SetDefaultProvider(name string)
- func UnregisterFunction(name string, connID uint64) error
- type BasicAPIServer
- type CacheItem
- type Config
- type LLMProvider
- type Provider
- type Server
- type Service
Constants ¶
const (
	// DefaultZipperAddr is the default endpoint of the zipper
	DefaultZipperAddr = "localhost:9000"
)
Variables ¶
var (
	// ErrNotExistsProvider is the error when the provider does not exist
	ErrNotExistsProvider = errors.New("llm provider does not exist")
	// ErrNotImplementedService is the error when the service is not implemented
	ErrNotImplementedService = errors.New("llm service is not implemented")
	// ErrConfigNotFound is the error when the ai config was not found
	ErrConfigNotFound = errors.New("ai config was not found")
	// ErrConfigFormatError is the error when the ai config format is incorrect
	ErrConfigFormatError = errors.New("ai config format is incorrect")
)
var (
	// ServiceCacheSize is the size of the service cache
	ServiceCacheSize = 1024
	// ServiceCacheTTL is the time to live of the service cache
	ServiceCacheTTL = time.Minute * 0 // 30
)
var RequestTimeout = 5 * time.Second
RequestTimeout is the timeout for the request, default is 5 seconds
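These package-level variables can be tuned before the service is started. A minimal sketch, assuming this package is imported as bridgeai (the import path is not shown on this page):

package main

import (
	"time"

	bridgeai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
)

func main() {
	// give slower llm providers more time before a request fails
	bridgeai.RequestTimeout = 10 * time.Second
	// bound the service cache; ServiceCacheTTL controls how long entries live
	bridgeai.ServiceCacheSize = 512
	bridgeai.ServiceCacheTTL = time.Minute * 30
}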
Functions ¶
func ConnMiddleware ¶
func ConnMiddleware(next core.ConnHandler) core.ConnHandler
ConnMiddleware returns a core.ConnHandler middleware that can be used to intercept the connection.

func ListProviders ¶

func ListProviders() []string

ListProviders returns the names of the registered llm providers
func ListToolCalls ¶

func ListToolCalls() (map[uint32]ai.ToolCall, error)

ListToolCalls lists the AI tool calls
func RegisterFunction ¶

func RegisterFunction(tag uint32, functionDefinition []byte, connID uint64) error

RegisterFunction registers the tool function
func RegisterProvider ¶
func RegisterProvider(provider LLMProvider)
RegisterProvider registers the llm provider

func Serve ¶

func Serve(config *Config, zipperListenAddr string, credential string) error

Serve starts the AI bridge service with the given config, zipper listen address and credential
func SetDefaultProvider ¶
func SetDefaultProvider(name string)
SetDefaultProvider sets the default llm provider
func UnregisterFunction ¶

func UnregisterFunction(name string, connID uint64) error

UnregisterFunction unregisters the tool function
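A minimal sketch of the registration lifecycle, assuming the package is imported as bridgeai, that a default provider is already registered, and that functionDefinition is a JSON-encoded function definition (all three are assumptions; none is shown on this page):

package main

import (
	"fmt"

	bridgeai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
)

func main() {
	// hypothetical function definition payload; the exact schema is an assumption
	fd := []byte(`{"name":"get_weather","description":"get the weather for a city"}`)

	// register the tool function on tag 0x33 for connection 1
	if err := bridgeai.RegisterFunction(0x33, fd, 1); err != nil {
		panic(err)
	}

	// list the registered tool calls, keyed by tag
	calls, err := bridgeai.ListToolCalls()
	if err != nil {
		panic(err)
	}
	fmt.Printf("registered %d tool call(s)\n", len(calls))

	// unregister by function name when the connection goes away
	if err := bridgeai.UnregisterFunction("get_weather", 1); err != nil {
		panic(err)
	}
}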
Types ¶
type BasicAPIServer ¶
type BasicAPIServer struct {
	// Name is the name of the server
	Name string
	// Config is the configuration of the server
	*Config
	// ZipperAddr is the address of the zipper
	ZipperAddr string
	// Provider is the llm provider
	Provider LLMProvider
	// contains filtered or unexported fields
}
BasicAPIServer provides a RESTful service for end users
func NewBasicAPIServer ¶
func NewBasicAPIServer(name string, config *Config, zipperAddr string, provider LLMProvider, credential string) (*BasicAPIServer, error)
NewBasicAPIServer creates a new restful service
func (*BasicAPIServer) Serve ¶
func (a *BasicAPIServer) Serve() error
Serve starts a RESTful service that provides an '/invoke' endpoint. Users submit questions to this endpoint; the service then generates a prompt from the question and the registered functions, and calls the completion API of the llm provider to get the functions and arguments to be invoked. These functions are invoked sequentially by YoMo, and all of them write their results to the reducer-sfn.
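A minimal sketch of starting the server, assuming the package is imported as bridgeai and that a provider named "azopenai" was registered earlier via RegisterProvider (both assumptions; this page does not show the import path):

package main

import (
	bridgeai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
)

func main() {
	config := &bridgeai.Config{
		Server: bridgeai.Server{Addr: ":8000", Provider: "azopenai"},
	}

	// look up the provider and make it the default
	provider, err := bridgeai.GetProviderAndSetDefault("azopenai")
	if err != nil {
		panic(err)
	}

	// DefaultZipperAddr ("localhost:9000") is the default zipper endpoint
	srv, err := bridgeai.NewBasicAPIServer("ai-bridge", config, bridgeai.DefaultZipperAddr, provider, "token:<CREDENTIAL>")
	if err != nil {
		panic(err)
	}

	// blocks, serving the '/invoke' endpoint
	if err := srv.Serve(); err != nil {
		panic(err)
	}
}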
type CacheItem ¶
type CacheItem struct {
	ResponseWriter http.ResponseWriter
	// contains filtered or unexported fields
}
CacheItem caches the http.ResponseWriter, which is used for writing the response from the reducer. TODO: http.ResponseWriter is from the SimpleRestfulServer interface, should be decoupled from here.
type Config ¶
type Config struct {
	Server    Server              `yaml:"server"`    // Server is the configuration of the BasicAPIServer
	Providers map[string]Provider `yaml:"providers"` // Providers is the configuration of llm provider
}
Config is the configuration of AI bridge. The configuration looks like:

bridge:
  ai:
    server:
      host: http://localhost
      port: 8000
      credential: token:<CREDENTIAL>
      provider: azopenai
    providers:
      azopenai:
        api_key:
        api_endpoint:
      openai:
        api_key:
        api_endpoint:
      huggingface:
        model:
type LLMProvider ¶
type LLMProvider interface {
	// Name returns the name of the llm provider
	Name() string
	// GetOverview returns the overview of the AI functions, key is the tag, value is the function definition
	GetOverview() (*ai.OverviewResponse, error)
	// GetChatCompletions returns the chat completions
	GetChatCompletions(prompt string) (*ai.InvokeResponse, error)
	// RegisterFunction registers the llm function
	RegisterFunction(tag uint32, functionDefinition *ai.FunctionDefinition, connID uint64) error
	// UnregisterFunction unregisters the llm function
	UnregisterFunction(name string, connID uint64) error
	// ListToolCalls lists the llm tool calls
	ListToolCalls() (map[uint32]ai.ToolCall, error)
}
LLMProvider provides an interface to the llm providers
func GetDefaultProvider ¶
func GetDefaultProvider() (LLMProvider, error)
GetDefaultProvider returns the default llm provider
func GetProvider ¶
func GetProvider(name string) LLMProvider
GetProvider returns the llm provider by name
func GetProviderAndSetDefault ¶
func GetProviderAndSetDefault(name string) (LLMProvider, error)
GetProviderAndSetDefault returns the llm provider by name and sets it as the default provider
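A minimal sketch of registering a custom provider, using a hypothetical do-nothing implementation purely for illustration and assuming the import paths noted in the comments (neither appears on this page):

package main

import (
	"fmt"

	"github.com/yomorun/yomo/ai"                     // assumed path of the ai types package
	bridgeai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path of this package
)

// mockProvider is a hypothetical LLMProvider that satisfies the interface
// with no-op implementations.
type mockProvider struct{}

func (p *mockProvider) Name() string { return "mock" }

func (p *mockProvider) GetOverview() (*ai.OverviewResponse, error) {
	return &ai.OverviewResponse{}, nil
}

func (p *mockProvider) GetChatCompletions(prompt string) (*ai.InvokeResponse, error) {
	return &ai.InvokeResponse{}, nil
}

func (p *mockProvider) RegisterFunction(tag uint32, fd *ai.FunctionDefinition, connID uint64) error {
	return nil
}

func (p *mockProvider) UnregisterFunction(name string, connID uint64) error { return nil }

func (p *mockProvider) ListToolCalls() (map[uint32]ai.ToolCall, error) {
	return map[uint32]ai.ToolCall{}, nil
}

func main() {
	// register the provider under its Name(), then make it the default
	bridgeai.RegisterProvider(&mockProvider{})
	bridgeai.SetDefaultProvider("mock")

	p, err := bridgeai.GetDefaultProvider()
	if err != nil {
		panic(err)
	}
	fmt.Println("default provider:", p.Name())
}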
type Server ¶
type Server struct {
	Addr     string `yaml:"addr"`     // Addr is the address of the server
	Provider string `yaml:"provider"` // Provider is the llm provider to use
}
Server is the configuration of the BasicAPIServer, which is the endpoint for end user access
type Service ¶
type Service struct {
	LLMProvider
	// contains filtered or unexported fields
}
Service invokes the LLM provider to determine the functions to be executed, then uses a source to send the arguments returned by the llm provider to the target functions. Finally, a reducer aggregates all the results and writes them via the http.ResponseWriter.
func NewService ¶
func NewService(credential string, zipperAddr string, aiProvider LLMProvider) (*Service, error)
NewService creates a new AI service; if the service has already been created, it returns the existing one
func (*Service) GetChatCompletions ¶
func (s *Service) GetChatCompletions(prompt string) (*ai.InvokeResponse, error)
GetChatCompletions returns the llm api response
func (*Service) GetOverview ¶
func (s *Service) GetOverview() (*ai.OverviewResponse, error)
GetOverview returns the overview of the AI functions, key is the tag, value is the function definition
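A minimal sketch of using Service directly, assuming the package is imported as bridgeai and that a default provider is already registered (both assumptions; the surrounding setup is not shown on this page):

package main

import (
	"fmt"

	bridgeai "github.com/yomorun/yomo/pkg/bridge/ai" // assumed import path
)

func main() {
	provider, err := bridgeai.GetDefaultProvider()
	if err != nil {
		panic(err)
	}

	// NewService returns the existing service if one was already created
	svc, err := bridgeai.NewService("token:<CREDENTIAL>", "localhost:9000", provider)
	if err != nil {
		panic(err)
	}

	// overview of the registered AI functions, keyed by tag
	overview, err := svc.GetOverview()
	if err != nil {
		panic(err)
	}
	fmt.Printf("overview: %+v\n", overview)

	// ask the llm provider which functions to invoke for this prompt
	resp, err := svc.GetChatCompletions("what is the weather in Paris?")
	if err != nil {
		panic(err)
	}
	fmt.Printf("invoke response: %+v\n", resp)
}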