ai

package
v1.18.7
Published: May 6, 2024 License: Apache-2.0 Imports: 22 Imported by: 1

Documentation

Overview

Package ai provides LLM Function Calling features.

Index

Constants

View Source
const (
	// DefaultZipperAddr is the default endpoint of the zipper
	DefaultZipperAddr = "localhost:9000"
)
View Source
const MetadataKey = "ai"

MetadataKey marks a function as an AI function.

Variables

View Source
var (
	// ErrNotExistsProvider is the error when the provider does not exist
	ErrNotExistsProvider = errors.New("llm provider does not exist")
	// ErrConfigNotFound is the error when the ai config was not found
	ErrConfigNotFound = errors.New("ai config was not found")
	// ErrConfigFormatError is the error when the ai config format is incorrect
	ErrConfigFormatError = errors.New("ai config format is incorrect")
)
View Source
var (
	// ServiceCacheSize is the size of the service cache
	ServiceCacheSize = 1024
	// ServiceCacheTTL is the time to live of the service cache
	ServiceCacheTTL = time.Minute * 30
)
View Source
var RequestTimeout = 5 * time.Second

RequestTimeout is the timeout for requests; the default is 5 seconds.

Functions

func ConnMiddleware

func ConnMiddleware(next core.ConnHandler) core.ConnHandler

ConnMiddleware returns a ConnMiddleware that can be used to intercept the connection.

func DefaultExchangeMetadataFunc added in v1.18.3

func DefaultExchangeMetadataFunc(credential string) (metadata.M, error)

DefaultExchangeMetadataFunc is the default ExchangeMetadataFunc; it returns empty metadata.

func HandleChatCompletions added in v1.18.7

func HandleChatCompletions(w http.ResponseWriter, r *http.Request)

HandleChatCompletions is the handler for POST /chat/completions

func HandleInvoke added in v1.18.3

func HandleInvoke(w http.ResponseWriter, r *http.Request)

HandleInvoke is the handler for POST /invoke

func HandleOverview added in v1.18.3

func HandleOverview(w http.ResponseWriter, r *http.Request)

HandleOverview is the handler for GET /overview

func ListProviders

func ListProviders() []string

ListProviders returns the list of llm providers

func RegisterProvider

func RegisterProvider(provider LLMProvider)

RegisterProvider registers the llm provider

func RespondWithError added in v1.18.7

func RespondWithError(w http.ResponseWriter, code int, err error)

RespondWithError writes an error to response according to the OpenAI API spec.

func Serve

func Serve(config *Config, zipperListenAddr string, credential string) error

Serve starts the Basic API Server

func SetDefaultProvider

func SetDefaultProvider(name string)

SetDefaultProvider sets the default llm provider
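The registry functions above (RegisterProvider, ListProviders, SetDefaultProvider, GetDefaultProvider) follow a common name-keyed registry pattern. The sketch below mirrors that pattern with a stand-in LLMProvider interface and a toy fakeProvider; it is illustrative only and does not use the real package's internals.

```go
package main

import (
	"errors"
	"fmt"
	"sort"
)

// Stand-in for the ai.LLMProvider interface; only Name is needed here.
type LLMProvider interface {
	Name() string
}

// fakeProvider is a toy provider used only for illustration.
type fakeProvider struct{ name string }

func (p fakeProvider) Name() string { return p.name }

// Minimal registry mirroring the RegisterProvider / ListProviders /
// SetDefaultProvider / GetDefaultProvider semantics documented above.
var (
	providers       = map[string]LLMProvider{}
	defaultProvider string
	errNotExists    = errors.New("llm provider does not exist")
)

func RegisterProvider(p LLMProvider) { providers[p.Name()] = p }

func ListProviders() []string {
	names := make([]string, 0, len(providers))
	for n := range providers {
		names = append(names, n)
	}
	sort.Strings(names)
	return names
}

func SetDefaultProvider(name string) { defaultProvider = name }

func GetDefaultProvider() (LLMProvider, error) {
	if p, ok := providers[defaultProvider]; ok {
		return p, nil
	}
	return nil, errNotExists
}

func main() {
	RegisterProvider(fakeProvider{"openai"})
	RegisterProvider(fakeProvider{"gemini"})
	SetDefaultProvider("openai")
	fmt.Println(ListProviders()) // → [gemini openai]
	p, _ := GetDefaultProvider()
	fmt.Println(p.Name()) // → openai
}
```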

func WithContextService added in v1.18.3

func WithContextService(handler http.Handler, credential string, zipperAddr string, provider LLMProvider, exFn ExchangeMetadataFunc) http.Handler

WithContextService wraps the handler so that each request's context carries the AI service

func WithServiceContext added in v1.18.3

func WithServiceContext(ctx context.Context, service *Service) context.Context

WithServiceContext adds the service to the request context

Types

type BasicAPIServer

type BasicAPIServer struct {
	// Name is the name of the server
	Name string
	// Config is the configuration of the server
	*Config
	// ZipperAddr is the address of the zipper
	ZipperAddr string
	// Provider is the llm provider
	Provider LLMProvider
	// contains filtered or unexported fields
}

BasicAPIServer provides a RESTful service for end users

func NewBasicAPIServer

func NewBasicAPIServer(name string, config *Config, zipperAddr string, provider LLMProvider, credential string) (*BasicAPIServer, error)

NewBasicAPIServer creates a new RESTful service

func (*BasicAPIServer) Serve

func (a *BasicAPIServer) Serve() error

Serve starts a RESTful service that provides a '/invoke' endpoint. Users submit questions to this endpoint. The service then generates a prompt based on the question and the registered functions, and calls the completion API of the LLM provider to get the functions and arguments to be invoked. These functions are invoked sequentially by YoMo, and all of them write their results to the reducer-sfn.

type Config

type Config struct {
	Server    Server              `yaml:"server"`    // Server is the configuration of the BasicAPIServer
	Providers map[string]Provider `yaml:"providers"` // Providers is the configuration of llm provider
}

Config is the configuration of the AI bridge. The configuration looks like:

bridge:
  ai:
    server:
      host: http://localhost
      port: 8000
      credential: token:<CREDENTIAL>
      provider: openai
    providers:
      azopenai:
        api_endpoint: https://<RESOURCE>.openai.azure.com
        deployment_id: <DEPLOYMENT_ID>
        api_key: <API_KEY>
        api_version: <API_VERSION>
      openai:
        api_key:
        api_endpoint:
      gemini:
        api_key:
      cloudflare_azure:
        endpoint: https://gateway.ai.cloudflare.com/v1/<CF_GATEWAY_ID>/<CF_GATEWAY_NAME>
        api_key: <AZURE_API_KEY>
        resource: <AZURE_OPENAI_RESOURCE>
        deployment_id: <AZURE_OPENAI_DEPLOYMENT_ID>
        api_version: <AZURE_OPENAI_API_VERSION>

func ParseConfig

func ParseConfig(conf map[string]any) (config *Config, err error)

ParseConfig parses the AI config from conf
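ParseConfig receives the `ai` subtree of the bridge configuration as a map[string]any, i.e. the decoded form of the YAML shown above. The sketch below builds such a map by hand (keys and values are illustrative) and reads the default provider name out of it, to show the shape ParseConfig works with.

```go
package main

import "fmt"

func main() {
	// Illustrative equivalent of the decoded "ai:" YAML subtree.
	conf := map[string]any{
		"server": map[string]any{
			"addr":     "localhost:8000",
			"provider": "openai",
		},
		"providers": map[string]any{
			"openai": map[string]any{
				"api_key":      "<API_KEY>",
				"api_endpoint": "",
			},
		},
	}

	// Nested values come back as map[string]any and need type assertions.
	server := conf["server"].(map[string]any)
	fmt.Println(server["provider"]) // → openai
}
```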

type ExchangeMetadataFunc added in v1.18.3

type ExchangeMetadataFunc func(credential string) (metadata.M, error)

ExchangeMetadataFunc exchanges a credential for connection metadata

type LLMProvider

type LLMProvider interface {
	// Name returns the name of the llm provider
	Name() string
	// GetChatCompletions returns the chat completions.
	GetChatCompletions(context.Context, openai.ChatCompletionRequest, metadata.M) (openai.ChatCompletionResponse, error)
	// GetChatCompletionsStream returns the chat completions in stream.
	GetChatCompletionsStream(context.Context, openai.ChatCompletionRequest, metadata.M) (ResponseRecver, error)
}

LLMProvider is the interface implemented by LLM providers

func GetDefaultProvider

func GetDefaultProvider() (LLMProvider, error)

GetDefaultProvider returns the default llm provider

func GetProvider

func GetProvider(name string) LLMProvider

GetProvider returns the llm provider by name

func GetProviderAndSetDefault

func GetProviderAndSetDefault(name string) (LLMProvider, error)

GetProviderAndSetDefault returns the llm provider by name and sets it as the default provider

type Provider

type Provider = map[string]string

Provider is the configuration of an LLM provider

type ResponseRecver added in v1.18.7

type ResponseRecver interface {
	// Recv is the receive function.
	Recv() (response openai.ChatCompletionStreamResponse, err error)
}

ResponseRecver receives streaming responses.
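A ResponseRecver is typically drained in a loop until an end-of-stream error. The sketch below shows that consumption pattern with a stand-in response type and a toy receiver that replays fixed chunks; io.EOF as the end-of-stream sentinel is an assumption here, following the go-openai streaming convention.

```go
package main

import (
	"errors"
	"fmt"
	"io"
)

// Stand-in for openai.ChatCompletionStreamResponse; illustrative only.
type ChatCompletionStreamResponse struct{ Delta string }

// ResponseRecver mirrors the package's streaming receiver interface.
type ResponseRecver interface {
	Recv() (ChatCompletionStreamResponse, error)
}

// sliceRecver replays fixed chunks, then reports io.EOF.
type sliceRecver struct {
	chunks []string
	i      int
}

func (s *sliceRecver) Recv() (ChatCompletionStreamResponse, error) {
	if s.i >= len(s.chunks) {
		return ChatCompletionStreamResponse{}, io.EOF
	}
	c := s.chunks[s.i]
	s.i++
	return ChatCompletionStreamResponse{Delta: c}, nil
}

// drain shows the typical consumption loop for a receiver returned by
// GetChatCompletionsStream.
func drain(r ResponseRecver) (string, error) {
	var out string
	for {
		resp, err := r.Recv()
		if errors.Is(err, io.EOF) {
			return out, nil // stream finished normally
		}
		if err != nil {
			return out, err
		}
		out += resp.Delta
	}
}

func main() {
	out, _ := drain(&sliceRecver{chunks: []string{"Hel", "lo"}})
	fmt.Println(out) // → Hello
}
```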

type Server

type Server struct {
	Addr     string `yaml:"addr"`     // Addr is the address of the server
	Provider string `yaml:"provider"` // Provider is the llm provider to use
}

Server is the configuration of the BasicAPIServer, which is the endpoint for end user access

type Service

type Service struct {
	Metadata metadata.M

	LLMProvider
	// contains filtered or unexported fields
}

Service invokes the LLM provider to determine which functions to execute, then uses the source to send the arguments returned by the LLM provider to the target functions. Finally, the reducer aggregates all the results, and the result is written to the http.ResponseWriter.

func FromServiceContext added in v1.18.3

func FromServiceContext(ctx context.Context) *Service

FromServiceContext returns the service from the request context

func LoadOrCreateService added in v1.18.3

func LoadOrCreateService(credential string, zipperAddr string, aiProvider LLMProvider, exFn ExchangeMetadataFunc) (*Service, error)

LoadOrCreateService loads or creates an AI service; if the service already exists, the existing one is returned

func (*Service) GetChatCompletions

func (s *Service) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, reqID string, w http.ResponseWriter, includeCallStack bool) error

GetChatCompletions returns the llm api response

func (*Service) GetInvoke added in v1.18.7

func (s *Service) GetInvoke(ctx context.Context, userInstruction string, baseSystemMessage string, reqID string, includeCallStack bool) (*ai.InvokeResponse, error)

GetInvoke returns the invoke response

func (*Service) GetOverview

func (s *Service) GetOverview() (*ai.OverviewResponse, error)

GetOverview returns an overview of the AI functions; the key is the tag and the value is the function definition

func (*Service) Release

func (s *Service) Release()

Release releases the resources

func (*Service) SetSystemPrompt added in v1.18.7

func (s *Service) SetSystemPrompt(prompt string)

SetSystemPrompt sets the system prompt

func (*Service) Write

func (s *Service) Write(tag uint32, data []byte) error

Write writes the data to zipper

Directories

Path	Synopsis
provider
	azopenai
		Package azopenai is used to provide the Azure OpenAI service
	cfazure
		Package cfazure is used to provide the Azure OpenAI service
	cfopenai
		Package cfopenai is used to provide the Cloudflare OpenAI service
	openai
		Package openai is the OpenAI llm provider
register
	Package register provides a register for registering and unregistering functions
