Documentation
Overview
Package openai provides the OpenAI LLM provider.
Index
- Constants
- type Provider
- func (p *Provider) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)
- func (p *Provider) GetChatCompletionsStream(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (bridgeai.ResponseRecver, error)
- func (p *Provider) Name() string
Constants
const APIEndpoint = "https://api.openai.com/v1/chat/completions"
APIEndpoint is the chat completions endpoint for OpenAI.
Variables
This section is empty.
Functions
This section is empty.
Types
type Provider added in v1.18.7
type Provider struct {
	// APIKey is the API key for OpenAI
	APIKey string
	// Model is the model for OpenAI
	// eg. "gpt-3.5-turbo-1106", "gpt-4-turbo-preview", "gpt-4-vision-preview", "gpt-4"
	Model string
	// contains filtered or unexported fields
}
Provider is the LLM provider for OpenAI.
func NewProvider
NewProvider creates a new OpenAI Provider.
func (*Provider) GetChatCompletions added in v1.18.7
func (p *Provider) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)
GetChatCompletions implements ai.LLMProvider.
func (*Provider) GetChatCompletionsStream added in v1.18.7
func (p *Provider) GetChatCompletionsStream(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (bridgeai.ResponseRecver, error)
GetChatCompletionsStream implements ai.LLMProvider.