ai

package
v1.18.0

Warning: This package is not in the latest version of its module.
Published: Feb 26, 2024 License: Apache-2.0 Imports: 16 Imported by: 1

Documentation

Overview

Package ai provides LLM Function Calling features

Index

Constants

View Source
const (
	// DefaultZipperAddr is the default endpoint of the zipper
	DefaultZipperAddr = "localhost:9000"
)

Variables

View Source
var (
	// ErrNotExistsProvider is the error when the provider does not exist
	ErrNotExistsProvider = errors.New("llm provider does not exist")
	// ErrNotImplementedService is the error when the service is not implemented
	ErrNotImplementedService = errors.New("llm service is not implemented")
	// ErrConfigNotFound is the error when the ai config was not found
	ErrConfigNotFound = errors.New("ai config was not found")
	// ErrConfigFormatError is the error when the ai config format is incorrect
	ErrConfigFormatError = errors.New("ai config format is incorrect")
)
View Source
var (
	// ServiceCacheSize is the size of the service cache
	ServiceCacheSize = 1024
	// ServiceCacheTTL is the time to live of the service cache
	ServiceCacheTTL = time.Minute * 30
)
View Source
var RequestTimeout = 5 * time.Second

RequestTimeout is the timeout for the request; the default is 5 seconds

Functions

func ConnMiddleware

func ConnMiddleware(next core.ConnHandler) core.ConnHandler

ConnMiddleware returns a ConnMiddleware that can be used to intercept the connection.

func ListProviders

func ListProviders() []string

ListProviders returns the list of llm providers

func ListToolCalls

func ListToolCalls() (map[uint32]ai.ToolCall, error)

ListToolCalls lists the AI tool calls

func RegisterFunction

func RegisterFunction(tag uint32, functionDefinition []byte, connID uint64) error

RegisterFunction registers the tool function

func RegisterProvider

func RegisterProvider(provider LLMProvider)

RegisterProvider registers the llm provider

func Serve

func Serve(config *Config, zipperListenAddr string, credential string) error

Serve starts the Basic API Server

func SetDefaultProvider

func SetDefaultProvider(name string)

SetDefaultProvider sets the default llm provider

func UnregisterFunction

func UnregisterFunction(name string, connID uint64) error

UnregisterFunction unregisters the tool function

Types

type BasicAPIServer

type BasicAPIServer struct {
	// Name is the name of the server
	Name string
	// Config is the configuration of the server
	*Config
	// ZipperAddr is the address of the zipper
	ZipperAddr string
	// Provider is the llm provider
	Provider LLMProvider
	// contains filtered or unexported fields
}

BasicAPIServer provides a RESTful service for end users

func NewBasicAPIServer

func NewBasicAPIServer(name string, config *Config, zipperAddr string, provider LLMProvider, credential string) (*BasicAPIServer, error)

NewBasicAPIServer creates a new RESTful service

func (*BasicAPIServer) Serve

func (a *BasicAPIServer) Serve() error

Serve starts a RESTful service that provides an '/invoke' endpoint. Users submit questions to this endpoint; the service then generates a prompt based on the question and the registered functions, and calls the completion API of the llm provider to get the functions and arguments to be invoked. These functions are invoked sequentially by YoMo, and all of them write their results to the reducer-sfn.

type CacheItem

type CacheItem struct {
	ResponseWriter http.ResponseWriter
	// contains filtered or unexported fields
}

CacheItem caches the http.ResponseWriter, which is used for writing the response from the reducer. TODO: http.ResponseWriter is from the SimpleRestfulServer interface, should be decoupled from here.

type Config

type Config struct {
	Server    Server              `yaml:"server"`    // Server is the configuration of the BasicAPIServer
	Providers map[string]Provider `yaml:"providers"` // Providers is the configuration of llm provider
}

Config is the configuration of the AI bridge. The configuration looks like:

bridge:
	ai:
		server:
			host: http://localhost
			port: 8000
			credential: token:<CREDENTIAL>
			provider: azopenai

		providers:
			azopenai:
				api_key:
				api_endpoint:

			openai:
				api_key:
				api_endpoint:

			huggingface:
				model:

func ParseConfig

func ParseConfig(conf map[string]any) (config *Config, err error)

ParseConfig parses the AI config from conf
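ParseConfig receives the map form of the YAML above. A self-contained sketch of the map shape and of how it lands in the documented Config/Server/Provider types — the struct definitions are local copies for illustration, and parseConfig is a hand-rolled stand-in, not this package's implementation (the real ParseConfig also validates the shape and can return ErrConfigFormatError):

```go
package main

import "fmt"

// Local copies of the documented types, for illustration only.
type Provider = map[string]string

type Server struct {
	Addr     string
	Provider string
}

type Config struct {
	Server    Server
	Providers map[string]Provider
}

// sampleConf mirrors the YAML example as a map[string]any, which is
// the input shape ParseConfig works with.
var sampleConf = map[string]any{
	"server": map[string]any{
		"addr":     "localhost:8000",
		"provider": "azopenai",
	},
	"providers": map[string]any{
		"azopenai": map[string]any{
			"api_key":      "<KEY>",
			"api_endpoint": "<ENDPOINT>",
		},
	},
}

// parseConfig extracts the typed Config from the raw map.
func parseConfig(conf map[string]any) Config {
	server := conf["server"].(map[string]any)
	cfg := Config{
		Server: Server{
			Addr:     server["addr"].(string),
			Provider: server["provider"].(string),
		},
		Providers: map[string]Provider{},
	}
	for name, raw := range conf["providers"].(map[string]any) {
		p := Provider{}
		for k, v := range raw.(map[string]any) {
			p[k] = v.(string)
		}
		cfg.Providers[name] = p
	}
	return cfg
}

func main() {
	cfg := parseConfig(sampleConf)
	fmt.Println(cfg.Server.Provider, cfg.Providers["azopenai"]["api_key"])
}
```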

type LLMProvider

type LLMProvider interface {
	// Name returns the name of the llm provider
	Name() string
	// GetOverview returns the overview of the AI functions, key is the tag, value is the function definition
	GetOverview() (*ai.OverviewResponse, error)
	// GetChatCompletions returns the chat completions
	GetChatCompletions(prompt string) (*ai.InvokeResponse, error)
	// RegisterFunction registers the llm function
	RegisterFunction(tag uint32, functionDefinition *ai.FunctionDefinition, connID uint64) error
	// UnregisterFunction unregisters the llm function
	UnregisterFunction(name string, connID uint64) error
	// ListToolCalls lists the llm tool calls
	ListToolCalls() (map[uint32]ai.ToolCall, error)
}

LLMProvider provides an interface to the llm providers
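A self-contained sketch of a provider implementing the same method set as LLMProvider. The FunctionDefinition, ToolCall, OverviewResponse, and InvokeResponse types below are simplified local stand-ins for the real types from the yomo ai packages, which are richer:

```go
package main

import "fmt"

// Simplified stand-ins for the types from the yomo ai packages.
type FunctionDefinition struct{ Name string }
type ToolCall struct{ Function FunctionDefinition }
type OverviewResponse struct{ Functions map[uint32]FunctionDefinition }
type InvokeResponse struct{ Content string }

// mockProvider keeps registered functions in a map keyed by tag.
type mockProvider struct {
	functions map[uint32]*FunctionDefinition
}

func (m *mockProvider) Name() string { return "mock" }

// GetOverview returns the registered functions, keyed by tag.
func (m *mockProvider) GetOverview() (*OverviewResponse, error) {
	ov := &OverviewResponse{Functions: map[uint32]FunctionDefinition{}}
	for tag, fd := range m.functions {
		ov.Functions[tag] = *fd
	}
	return ov, nil
}

// GetChatCompletions echoes the prompt instead of calling a real llm.
func (m *mockProvider) GetChatCompletions(prompt string) (*InvokeResponse, error) {
	return &InvokeResponse{Content: "echo: " + prompt}, nil
}

// RegisterFunction stores the function definition under its tag.
func (m *mockProvider) RegisterFunction(tag uint32, fd *FunctionDefinition, connID uint64) error {
	m.functions[tag] = fd
	return nil
}

// UnregisterFunction removes every function with the given name.
func (m *mockProvider) UnregisterFunction(name string, connID uint64) error {
	for tag, fd := range m.functions {
		if fd.Name == name {
			delete(m.functions, tag)
		}
	}
	return nil
}

// ListToolCalls exposes the registered functions as tool calls.
func (m *mockProvider) ListToolCalls() (map[uint32]ToolCall, error) {
	calls := map[uint32]ToolCall{}
	for tag, fd := range m.functions {
		calls[tag] = ToolCall{Function: *fd}
	}
	return calls, nil
}

func main() {
	p := &mockProvider{functions: map[uint32]*FunctionDefinition{}}
	p.RegisterFunction(0x33, &FunctionDefinition{Name: "get_weather"}, 1)
	calls, _ := p.ListToolCalls()
	fmt.Println(p.Name(), len(calls))
}
```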

func GetDefaultProvider

func GetDefaultProvider() (LLMProvider, error)

GetDefaultProvider returns the default llm provider

func GetProvider

func GetProvider(name string) LLMProvider

GetProvider returns the llm provider by name

func GetProviderAndSetDefault

func GetProviderAndSetDefault(name string) (LLMProvider, error)

GetProviderAndSetDefault returns the llm provider by name and sets it as the default provider

type Provider

type Provider = map[string]string

Provider is the configuration of llm provider

type Server

type Server struct {
	Addr     string `yaml:"addr"`     // Addr is the address of the server
	Provider string `yaml:"provider"` // Provider is the llm provider to use
}

Server is the configuration of the BasicAPIServer, which is the endpoint for end-user access

type Service

type Service struct {
	LLMProvider
	// contains filtered or unexported fields
}

Service is used to invoke the LLM provider to get the functions to be executed, then uses the source to send the arguments returned by the llm provider to the target function. Finally, it uses the reducer to aggregate all the results and writes the result via the http.ResponseWriter.

func NewService

func NewService(credential string, zipperAddr string, aiProvider LLMProvider) (*Service, error)

NewService creates a new AI service; if the service is already created, it returns the existing one

func (*Service) GetChatCompletions

func (s *Service) GetChatCompletions(prompt string) (*ai.InvokeResponse, error)

GetChatCompletions returns the llm API response

func (*Service) GetOverview

func (s *Service) GetOverview() (*ai.OverviewResponse, error)

GetOverview returns the overview of the AI functions, key is the tag, value is the function definition

func (*Service) Release

func (s *Service) Release()

Release releases the resources

func (*Service) Write

func (s *Service) Write(tag uint32, data []byte) error

Write writes the data to zipper

Directories

Path Synopsis
provider/azopenai
Package azopenai is used to provide the Azure OpenAI service
