Documentation
Index
- Constants
- Variables
- func EmbedTexts(ctx context.Context, appState *models.AppState, model *models.EmbeddingModel, ...) ([][]float32, error)
- func EmbedTextsOpenAI(ctx context.Context, appState *models.AppState, texts []string) ([][]float32, error)
- func GetLLMModelName(cfg *config.Config) (string, error)
- func GetMessageEmbeddingModel(appState *models.AppState) (*models.EmbeddingModel, error)
- func GetTokenCount(text string) (int, error)
- func NewOpenAIRetryClient(cfg *config.Config) *openairetryclient.OpenAIRetryClient
- func RunChatCompletion(ctx context.Context, appState *models.AppState, summaryMaxTokens int, ...) (resp openai.ChatCompletionResponse, err error)
- type LLMError
Constants
const DefaultTemperature = 0.0
const InvalidLLMModelError = "llm model is not set or is invalid"
const OpenAIAPIKeyNotSetError = "ZEP_OPENAI_API_KEY is not set" //nolint:gosec
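The snippet below is an illustrative sketch, not part of this package: it shows how a caller might reuse DefaultTemperature when building a request with github.com/sashabaranov/go-openai, the client library implied by RunChatCompletion's return type. The llms import path and the buildRequest helper are assumptions.

package example

import (
	openai "github.com/sashabaranov/go-openai"

	"github.com/getzep/zep/pkg/llms" // assumed import path for this package
)

// buildRequest is a hypothetical caller-side helper. It reuses the package's
// DefaultTemperature constant (0.0) so requests default to deterministic output.
func buildRequest(model string, messages []openai.ChatCompletionMessage) openai.ChatCompletionRequest {
	return openai.ChatCompletionRequest{
		Model:       model,
		Messages:    messages,
		Temperature: llms.DefaultTemperature,
	}
}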
Variables
var MaxLLMTokensMap = map[string]int{
	"gpt-3.5-turbo":      4096,
	"gpt-3.5-turbo-0301": 8192,
	"gpt-4":              8192,
}
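As an illustrative sketch only: the helper below combines GetLLMModelName with MaxLLMTokensMap to find the context window for the configured model. The import paths (github.com/getzep/zep/config, github.com/getzep/zep/pkg/llms) and the contextWindowFor helper are assumptions, not part of this package.

package example

import (
	"fmt"

	"github.com/getzep/zep/config"   // assumed import path for the config package
	"github.com/getzep/zep/pkg/llms" // assumed import path for this package
)

// contextWindowFor is a hypothetical helper: it resolves the configured model
// name and looks up that model's token limit in MaxLLMTokensMap.
func contextWindowFor(cfg *config.Config) (int, error) {
	name, err := llms.GetLLMModelName(cfg)
	if err != nil {
		return 0, err
	}
	maxTokens, ok := llms.MaxLLMTokensMap[name]
	if !ok {
		return 0, fmt.Errorf("no token limit known for model %q", name)
	}
	return maxTokens, nil
}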
Functions
func EmbedTexts added in v0.6.5
func EmbedTexts(ctx context.Context, appState *models.AppState, model *models.EmbeddingModel, ...) ([][]float32, error)

func EmbedTextsOpenAI added in v0.6.5
func EmbedTextsOpenAI(ctx context.Context, appState *models.AppState, texts []string) ([][]float32, error)

func GetLLMModelName
func GetLLMModelName(cfg *config.Config) (string, error)

func GetMessageEmbeddingModel added in v0.6.5
func GetMessageEmbeddingModel(appState *models.AppState) (*models.EmbeddingModel, error)

func GetTokenCount
func GetTokenCount(text string) (int, error)

func NewOpenAIRetryClient added in v0.4.6
func NewOpenAIRetryClient(cfg *config.Config) *openairetryclient.OpenAIRetryClient

func RunChatCompletion
func RunChatCompletion(ctx context.Context, appState *models.AppState, summaryMaxTokens int, ...) (resp openai.ChatCompletionResponse, err error)
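A minimal usage sketch for GetTokenCount, assuming the package is imported as llms from github.com/getzep/zep/pkg/llms; fitsTokenBudget is a hypothetical helper, not part of the API.

package example

import "github.com/getzep/zep/pkg/llms" // assumed import path for this package

// fitsTokenBudget is a hypothetical helper built on GetTokenCount: it reports
// whether text stays within a token budget before the text is sent to the LLM.
func fitsTokenBudget(text string, budget int) (bool, error) {
	n, err := llms.GetTokenCount(text)
	if err != nil {
		return false, err
	}
	return n <= budget, nil
}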
Types

type LLMError
Source Files