Documentation ¶
Overview ¶
Package azopenai provides the Azure OpenAI service as an LLM provider.
Index ¶
- type Provider
- func NewProvider(apiKey string, apiEndpoint string, deploymentID string, apiVersion string) *Provider
- func (p *Provider) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)
- func (p *Provider) GetChatCompletionsStream(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (ai.ResponseRecver, error)
- func (p *Provider) Name() string
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Provider ¶ added in v1.18.7
type Provider struct {
	APIKey       string
	APIEndpoint  string
	DeploymentID string
	APIVersion   string
	// contains filtered or unexported fields
}
Provider is the provider for Azure OpenAI
func NewProvider ¶ added in v1.18.1
func NewProvider(apiKey string, apiEndpoint string, deploymentID string, apiVersion string) *Provider
NewProvider creates a new Azure OpenAI Provider
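A minimal construction sketch. The import path, endpoint format, deployment ID, and API version below are illustrative assumptions, not values confirmed by this page.

package main

import (
	// Assumed import path for this provider package; verify against the module layout.
	"github.com/yomorun/yomo/pkg/bridge/ai/provider/azopenai"
)

func main() {
	// Endpoint, deployment ID and API version are placeholders for your Azure OpenAI resource.
	provider := azopenai.NewProvider(
		"<azure-api-key>",
		"https://<resource-name>.openai.azure.com",
		"<deployment-id>",
		"2023-05-15",
	)
	_ = provider
}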
func (*Provider) GetChatCompletions ¶ added in v1.18.7
func (p *Provider) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)
GetChatCompletions gets chat completions from the AI service
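A hedged usage sketch, assuming the openai package in the signature is github.com/sashabaranov/go-openai and that nil is acceptable for the ignored metadata.M argument; neither assumption is confirmed by this page.

import (
	"context"
	"fmt"

	openai "github.com/sashabaranov/go-openai"
	"github.com/yomorun/yomo/pkg/bridge/ai/provider/azopenai" // assumed import path
)

func chat(ctx context.Context, p *azopenai.Provider) error {
	req := openai.ChatCompletionRequest{
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Hello"},
		},
	}
	// The metadata argument is unused per the method signature, so nil is passed here (assumption).
	resp, err := p.GetChatCompletions(ctx, req, nil)
	if err != nil {
		return err
	}
	fmt.Println(resp.Choices[0].Message.Content)
	return nil
}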
func (*Provider) GetChatCompletionsStream ¶ added in v1.18.7
func (p *Provider) GetChatCompletionsStream(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (ai.ResponseRecver, error)
GetChatCompletionsStream implements ai.LLMProvider.
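A hedged streaming sketch, assuming ai.ResponseRecver exposes a Recv method that yields go-openai style ChatCompletionStreamResponse chunks and returns io.EOF when the stream ends; these are assumptions about the ai package, not documented here.

import (
	"context"
	"errors"
	"fmt"
	"io"

	openai "github.com/sashabaranov/go-openai"
	"github.com/yomorun/yomo/pkg/bridge/ai/provider/azopenai" // assumed import path
)

func chatStream(ctx context.Context, p *azopenai.Provider, req openai.ChatCompletionRequest) error {
	stream, err := p.GetChatCompletionsStream(ctx, req, nil)
	if err != nil {
		return err
	}
	for {
		chunk, err := stream.Recv() // assumed Recv signature, see note above
		if errors.Is(err, io.EOF) {
			return nil // stream finished
		}
		if err != nil {
			return err
		}
		fmt.Print(chunk.Choices[0].Delta.Content)
	}
}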