llamacpp

package
v1.0.10
Warning: This package is not in the latest version of its module.
Published: Sep 1, 2024 License: MIT Imports: 8 Imported by: 0

Documentation

Constants

const (
	RoleSystem    = "system"
	RoleAssistant = "assistant"
	RoleUser      = "user"
	RoleFunction  = "function"
	RoleTool      = "tool"
)
const (
	DefaultAPIVersion = "2023-05-15"
)
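
The role constants are the wire-level role strings accepted by OpenAI-compatible chat endpoints. As an illustration only, a hedged sketch of mapping llms message types onto them; the helper and the import paths below are assumptions, not part of this package:

package main

import (
	"fmt"

	"github.com/tmc/langchaingo/llms"          // assumed import path for the llms package used on this page
	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

// roleForMessageType is a hypothetical helper that maps llms chat message
// types onto this package's role constants.
func roleForMessageType(t llms.ChatMessageType) string {
	switch t {
	case llms.ChatMessageTypeSystem:
		return llamacpp.RoleSystem
	case llms.ChatMessageTypeAI:
		return llamacpp.RoleAssistant
	case llms.ChatMessageTypeHuman:
		return llamacpp.RoleUser
	case llms.ChatMessageTypeFunction:
		return llamacpp.RoleFunction
	case llms.ChatMessageTypeTool:
		return llamacpp.RoleTool
	default:
		return llamacpp.RoleUser
	}
}

func main() {
	fmt.Println(roleForMessageType(llms.ChatMessageTypeHuman)) // prints "user"
}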

Variables

var (
	ErrEmptyResponse              = errors.New("no response")
	ErrMissingToken               = errors.New("missing the OpenAI API key, set it in the OPENAI_API_KEY environment variable") //nolint:lll
	ErrMissingAzureModel          = errors.New("model needs to be provided when using Azure API")
	ErrMissingAzureEmbeddingModel = errors.New("embeddings model needs to be provided when using Azure API")

	ErrUnexpectedResponseLength = errors.New("unexpected length of response")
)
var ResponseFormatJSON = &ResponseFormat{Type: "json_object"} //nolint:gochecknoglobals

ResponseFormatJSON is the JSON response format.
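
A minimal sketch of requesting JSON output via WithResponseFormat; the import path and dummy token are assumptions:

package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	// Ask the server to return JSON-formatted responses.
	llm, err := llamacpp.New(
		llamacpp.WithToken("not-needed-locally"), // avoids ErrMissingToken when OPENAI_API_KEY is unset
		llamacpp.WithResponseFormat(llamacpp.ResponseFormatJSON),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}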

Functions

func ExtractToolParts

func ExtractToolParts(msg *ChatMessage) ([]llms.ContentPart, []llms.ToolCall)

ExtractToolParts extracts the tool parts from a message.
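
A sketch of the call shape only; a real ChatMessage would come from the underlying llamacppclient response, and its fields are not documented on this page:

package main

import (
	"fmt"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	var msg llamacpp.ChatMessage // placeholder value; normally taken from a client response
	parts, toolCalls := llamacpp.ExtractToolParts(&msg)
	fmt.Printf("content parts: %d, tool calls: %d\n", len(parts), len(toolCalls))
}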

Types

type APIType

type APIType llamacppclient.APIType

type ChatMessage

type ChatMessage = llamacppclient.ChatMessage

type LLM

type LLM struct {
	CallbacksHandler callbacks.Handler
	// contains filtered or unexported fields
}

func New

func New(opts ...Option) (*LLM, error)

New returns a new llamacpp LLM.
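
A hedged construction sketch pointed at a local llama.cpp server exposing an OpenAI-compatible endpoint; the base URL, model name, token, and import path are assumptions:

package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	llm, err := llamacpp.New(
		llamacpp.WithBaseURL("http://localhost:8080/v1"), // example local llama.cpp server address
		llamacpp.WithModel("local-model"),                // hypothetical model name
		llamacpp.WithToken("not-needed-locally"),         // avoids ErrMissingToken when OPENAI_API_KEY is unset
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}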

func (*LLM) Call

func (o *LLM) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)

Call requests a completion for the given prompt.
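
A short usage sketch, under the same assumptions as the New example above:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"          // assumed import path for the llms package
	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	llm, err := llamacpp.New(llamacpp.WithToken("not-needed-locally"))
	if err != nil {
		log.Fatal(err)
	}
	// Request a single completion for a plain-text prompt.
	completion, err := llm.Call(context.Background(), "Name one planet.",
		llms.WithTemperature(0.2),
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(completion)
}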

func (*LLM) CreateEmbedding

func (o *LLM) CreateEmbedding(ctx context.Context, inputTexts []string) ([][]float32, error)

CreateEmbedding creates embeddings for the given input texts.
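
A hedged sketch; the embedding model name and import path are assumptions:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	llm, err := llamacpp.New(
		llamacpp.WithToken("not-needed-locally"),
		llamacpp.WithEmbeddingModel("local-embedding-model"), // hypothetical model name
	)
	if err != nil {
		log.Fatal(err)
	}
	// One embedding vector is returned per input text.
	vectors, err := llm.CreateEmbedding(context.Background(), []string{"hello", "world"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(vectors), len(vectors[0]))
}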

func (*LLM) GenerateContent

func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error)

GenerateContent implements the Model interface.
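
A hedged sketch using the llms helper types; the import paths and dummy token are assumptions:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"          // assumed import path for the llms package
	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	llm, err := llamacpp.New(llamacpp.WithToken("not-needed-locally"))
	if err != nil {
		log.Fatal(err)
	}
	messages := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, "You answer briefly."),
		llms.TextParts(llms.ChatMessageTypeHuman, "What is llama.cpp?"),
	}
	resp, err := llm.GenerateContent(context.Background(), messages, llms.WithMaxTokens(128))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Content)
}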

type Option

type Option func(*options)

Option is a functional option for the llamacpp client.

func WithAPIType

func WithAPIType(apiType APIType) Option

WithAPIType passes the API type to the client. If not set, the default value is APITypeOpenAI.

func WithAPIVersion

func WithAPIVersion(apiVersion string) Option

WithAPIVersion passes the API version to the client. If not set, the default value is DefaultAPIVersion.

func WithBaseURL

func WithBaseURL(baseURL string) Option

WithBaseURL passes the OpenAI base URL to the client. If not set, the base URL is read from the OPENAI_BASE_URL environment variable. If OPENAI_BASE_URL is also unset, the default value https://api.openai.com/v1 is used.

func WithCallback

func WithCallback(callbackHandler callbacks.Handler) Option

WithCallback allows setting a custom Callback Handler.
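
For instance, a hedged sketch wiring in the LogHandler from the callbacks package; the import paths are assumptions:

package main

import (
	"log"

	"github.com/tmc/langchaingo/callbacks"     // assumed import path for the callbacks package
	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	// Any callbacks.Handler implementation can be used; LogHandler simply logs events.
	llm, err := llamacpp.New(
		llamacpp.WithToken("not-needed-locally"),
		llamacpp.WithCallback(callbacks.LogHandler{}),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}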

func WithEmbeddingModel

func WithEmbeddingModel(embeddingModel string) Option

WithEmbeddingModel passes the OpenAI embeddings model to the client. Required when the APIType is Azure.

func WithHTTPClient

func WithHTTPClient(client llamacppclient.Doer) Option

WithHTTPClient allows setting a custom HTTP client. If not set, the default value is http.DefaultClient.
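
A hedged sketch; it assumes *http.Client satisfies the Doer interface (i.e. Doer requires only a Do method) and that the import path below is correct:

package main

import (
	"log"
	"net/http"
	"time"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	// Inject an HTTP client with a custom timeout instead of http.DefaultClient.
	httpClient := &http.Client{Timeout: 30 * time.Second}
	llm, err := llamacpp.New(
		llamacpp.WithToken("not-needed-locally"),
		llamacpp.WithHTTPClient(httpClient),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}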

func WithModel

func WithModel(model string) Option

WithModel passes the OpenAI model to the client. If not set, the model is read from the OPENAI_MODEL environment variable. Required when the APIType is Azure.
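
For Azure-style configuration, a hedged sketch; the APITypeAzure constant name, the deployment names, the endpoint, and the import path are assumptions not documented on this page:

package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/llamacpp" // assumed import path for this package
)

func main() {
	llm, err := llamacpp.New(
		llamacpp.WithAPIType(llamacpp.APITypeAzure),                       // assumed constant name; only the APIType type itself is listed here
		llamacpp.WithAPIVersion("2023-05-15"),                             // matches DefaultAPIVersion
		llamacpp.WithBaseURL("https://example-resource.openai.azure.com"), // hypothetical Azure endpoint
		llamacpp.WithToken("azure-api-key"),
		llamacpp.WithModel("chat-deployment"),               // required on Azure (see ErrMissingAzureModel)
		llamacpp.WithEmbeddingModel("embedding-deployment"), // required on Azure for embeddings
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}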

func WithOrganization

func WithOrganization(organization string) Option

WithOrganization passes the OpenAI organization to the client. If not set, the organization is read from the OPENAI_ORGANIZATION environment variable.

func WithResponseFormat

func WithResponseFormat(responseFormat *ResponseFormat) Option

WithResponseFormat allows setting a custom response format.

func WithToken

func WithToken(token string) Option

WithToken passes the OpenAI API token to the client. If not set, the token is read from the OPENAI_API_KEY environment variable.

type ResponseFormat

type ResponseFormat = llamacppclient.ResponseFormat

ResponseFormat is the response format for the llamacpp client.

Directories

Path Synopsis
internal
