provider

package provider
Version: v1.18.17
Published: Nov 5, 2024 License: Apache-2.0 Imports: 10 Imported by: 0

Documentation

Overview

Package provider defines the ai.Provider interface and provides a mock provider for unit testing.

Index

Constants

This section is empty.

Variables

var ErrNotExistsProvider = errors.New("llm provider does not exist")

ErrNotExistsProvider is the error returned when the requested provider does not exist.

Functions

func ListProviders

func ListProviders() []string

ListProviders returns the names of the registered LLM providers.

func RegisterProvider

func RegisterProvider(provider LLMProvider)

RegisterProvider registers an LLM provider so that it can later be looked up by name.
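
A minimal sketch of registering a provider and listing what has been registered. It uses the Mock provider documented below so the example stays self-contained; the import path for this package is an assumption and may differ in your module.

package main

import (
	"fmt"
	"log"

	// Assumed import path for this package; adjust to your module.
	"github.com/yomorun/yomo/pkg/bridge/ai/provider"
)

func main() {
	// Build a mock provider (see the Mock type below) so there is something to register.
	mock, err := provider.NewMock("mock", provider.MockChatCompletionResponse("hello"))
	if err != nil {
		log.Fatal(err)
	}

	// RegisterProvider adds the provider to the package-level registry.
	provider.RegisterProvider(mock)

	// ListProviders reports the names of all registered providers.
	fmt.Println(provider.ListProviders())
}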

Types

type LLMProvider

type LLMProvider interface {
	// Name returns the name of the llm provider
	Name() string
	// GetChatCompletions returns the chat completions.
	GetChatCompletions(context.Context, openai.ChatCompletionRequest, metadata.M) (openai.ChatCompletionResponse, error)
	// GetChatCompletionsStream returns the chat completions in stream.
	GetChatCompletionsStream(context.Context, openai.ChatCompletionRequest, metadata.M) (ResponseRecver, error)
}

LLMProvider is the interface implemented by the LLM providers.

func GetProvider

func GetProvider(name string) (LLMProvider, error)

GetProvider returns the LLM provider registered under the given name.
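
A sketch of looking up a provider by name and handling the not-found case; whether the sentinel error is returned directly or wrapped, errors.Is covers both. The import path is the same assumption as above.

package main

import (
	"errors"
	"fmt"

	// Assumed import path for this package; adjust to your module.
	"github.com/yomorun/yomo/pkg/bridge/ai/provider"
)

func main() {
	p, err := provider.GetProvider("openai")
	if errors.Is(err, provider.ErrNotExistsProvider) {
		fmt.Println("no provider registered under that name")
		return
	}
	if err != nil {
		fmt.Println("lookup failed:", err)
		return
	}
	fmt.Println("using provider:", p.Name())
}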

type Mock

type Mock struct {
	// contains filtered or unexported fields
}

Mock implements the ai.Provider interface. It can be used to record requests and mock responses.

func NewMock

func NewMock(name string, data ...MockData) (*Mock, error)

NewMock returns a mock provider.
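
A sketch of driving the mock in a unit test. It assumes the openai types come from github.com/sashabaranov/go-openai and that nil is acceptable for the metadata argument (the mock's signatures show the metadata parameter is ignored); the import paths are assumptions.

package provider_test

import (
	"context"
	"testing"

	"github.com/sashabaranov/go-openai"              // assumed openai module
	"github.com/yomorun/yomo/pkg/bridge/ai/provider" // assumed import path for this package
)

func TestMockChatCompletions(t *testing.T) {
	// The mock replays the canned response supplied via MockChatCompletionResponse.
	mock, err := provider.NewMock("mock", provider.MockChatCompletionResponse("hi there"))
	if err != nil {
		t.Fatal(err)
	}

	req := openai.ChatCompletionRequest{
		Model:    "gpt-4o",
		Messages: []openai.ChatCompletionMessage{{Role: openai.ChatMessageRoleUser, Content: "hello"}},
	}

	// The mock ignores the context and metadata arguments.
	resp, err := mock.GetChatCompletions(context.Background(), req, nil)
	if err != nil {
		t.Fatal(err)
	}
	if len(resp.Choices) == 0 {
		t.Fatal("expected at least one choice in the mocked response")
	}
	t.Log(resp.Choices[0].Message.Content)
}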

func (*Mock) GetChatCompletions

func (m *Mock) GetChatCompletions(_ context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)

GetChatCompletions implements the ai.Provider interface.

func (*Mock) GetChatCompletionsStream

func (m *Mock) GetChatCompletionsStream(_ context.Context, req openai.ChatCompletionRequest, _ metadata.M) (ResponseRecver, error)

GetChatCompletionsStream implements the ai.Provider interface.

func (*Mock) Name

func (m *Mock) Name() string

Name returns the provider name.

func (*Mock) RequestRecords

func (m *Mock) RequestRecords() []openai.ChatCompletionRequest

RequestRecords returns the request records.
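
Continuing in the same test style, RequestRecords makes it easy to assert what the code under test actually sent to the provider; the imports are the same assumptions as in the sketch above.

func TestMockRecordsRequests(t *testing.T) {
	mock, err := provider.NewMock("mock", provider.MockChatCompletionResponse("ok"))
	if err != nil {
		t.Fatal(err)
	}

	req := openai.ChatCompletionRequest{
		Messages: []openai.ChatCompletionMessage{{Role: openai.ChatMessageRoleUser, Content: "ping"}},
	}
	if _, err := mock.GetChatCompletions(context.Background(), req, nil); err != nil {
		t.Fatal(err)
	}

	// Every request handled by the mock is recorded.
	records := mock.RequestRecords()
	if len(records) != 1 || records[0].Messages[0].Content != "ping" {
		t.Fatalf("unexpected request records: %+v", records)
	}
}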

type MockData

type MockData interface {
	// contains filtered or unexported methods
}

MockData supplies mock response data to the mock provider.

func MockChatCompletionResponse

func MockChatCompletionResponse(str ...string) MockData

MockChatCompletionResponse supplies mock chat completion response data to the mock provider.

func MockChatCompletionStreamResponse

func MockChatCompletionStreamResponse(str ...string) MockData

MockChatCompletionStreamResponse supplies mock response data to the mock provider in the form of a stream.

type ResponseRecver

type ResponseRecver interface {
	// Recv is the receive function.
	Recv() (response openai.ChatCompletionStreamResponse, err error)
}

ResponseRecver receives streamed responses.
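
A sketch of draining a stream through ResponseRecver, using the mock's stream data. It assumes the stream signals completion with io.EOF (the go-openai convention) and that the openai types come from github.com/sashabaranov/go-openai; the import paths are assumptions.

package main

import (
	"context"
	"errors"
	"fmt"
	"io"
	"log"

	"github.com/sashabaranov/go-openai"              // assumed openai module
	"github.com/yomorun/yomo/pkg/bridge/ai/provider" // assumed import path for this package
)

func main() {
	// A mock provider that streams back the supplied text.
	mock, err := provider.NewMock("mock", provider.MockChatCompletionStreamResponse("streamed reply"))
	if err != nil {
		log.Fatal(err)
	}

	req := openai.ChatCompletionRequest{
		Messages: []openai.ChatCompletionMessage{{Role: openai.ChatMessageRoleUser, Content: "hello"}},
	}

	stream, err := mock.GetChatCompletionsStream(context.Background(), req, nil)
	if err != nil {
		log.Fatal(err)
	}

	// Read chunks until the stream is exhausted; io.EOF is assumed to mark the end.
	for {
		chunk, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		if len(chunk.Choices) > 0 {
			fmt.Print(chunk.Choices[0].Delta.Content)
		}
	}
}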

Directories

Path          Synopsis
anthropic     Package anthropic is the anthropic llm provider, see https://docs.anthropic.com
azopenai      Package azopenai is used to provide the Azure OpenAI service
cerebras      Package cerebras is the Cerebras llm provider
cfazure       Package cfazure is used to provide the Azure OpenAI service
cfopenai      Package cfopenai is used to provide the Azure OpenAI service
gemini        Package gemini is used to provide the gemini service
githubmodels  Package githubmodels is the Github Models llm provider, see https://github.com/marketplace/models
ollama        Package ollama is used to provide the Ollama service for YoMo Bridge.
openai        Package openai is the OpenAI llm provider
xai           Package xai is the x.ai provider
