Documentation ¶
Index ¶
- func NewApp(ctx context.Context, appSettings backend.AppInstanceSettings) (instancemgmt.Instance, error)
- type App
- func (a *App) CheckHealth(ctx context.Context, req *backend.CheckHealthRequest) (*backend.CheckHealthResult, error)
- func (a *App) Dispose()
- func (a *App) PublishStream(context.Context, *backend.PublishStreamRequest) (*backend.PublishStreamResponse, error)
- func (a *App) RunStream(ctx context.Context, req *backend.RunStreamRequest, ...) error
- func (a *App) SubscribeStream(ctx context.Context, req *backend.SubscribeStreamRequest) (*backend.SubscribeStreamResponse, error)
- type EventDone
- type EventError
- type LLMGatewaySettings
- type OpenAISettings
- type Settings
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func NewApp ¶
func NewApp(ctx context.Context, appSettings backend.AppInstanceSettings) (instancemgmt.Instance, error)
NewApp creates a new example *App instance.
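A minimal sketch of wiring this factory into a plugin binary with the Grafana plugin SDK's app package. The plugin ID and import path below are assumptions for illustration, not values taken from this package.

package main

import (
	"os"

	"github.com/grafana/grafana-plugin-sdk-go/backend/app"
	"github.com/grafana/grafana-plugin-sdk-go/backend/log"

	// Hypothetical import path for the package documented here.
	plugin "example.com/llm-app/pkg/plugin"
)

func main() {
	// app.Manage serves the app plugin and calls plugin.NewApp whenever a new
	// app instance is required. The plugin ID is illustrative only.
	if err := app.Manage("example-llm-app", plugin.NewApp, app.ManageOpts{}); err != nil {
		log.DefaultLogger.Error(err.Error())
		os.Exit(1)
	}
}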
Types ¶
type App ¶
type App struct {
	backend.CallResourceHandler
	// contains filtered or unexported fields
}
App is an example app backend plugin which can respond to data queries.
func (*App) CheckHealth ¶
func (a *App) CheckHealth(ctx context.Context, req *backend.CheckHealthRequest) (*backend.CheckHealthResult, error)
CheckHealth handles health checks sent from Grafana to the plugin. It returns whether each feature is working based on the plugin settings.
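For illustration, a hedged sketch of exercising the health check directly (for example from a test in this package); the settings JSON is a minimal assumption based on the Settings fields documented below, and ctx is an existing context.Context.

settings := backend.AppInstanceSettings{
	JSONData: []byte(`{"openAI": {"url": "https://api.openai.com"}}`),
}
instance, err := NewApp(ctx, settings)
if err != nil {
	// handle error
}
a := instance.(*App)
res, err := a.CheckHealth(ctx, &backend.CheckHealthRequest{})
if err != nil {
	// handle error
}
// res.Status and res.Message report whether the configured features are working.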
func (*App) Dispose ¶
func (a *App) Dispose()
Dispose tells the plugin SDK that the plugin wants to clean up resources when a new instance is created.
func (*App) PublishStream ¶
func (a *App) PublishStream(context.Context, *backend.PublishStreamRequest) (*backend.PublishStreamResponse, error)
func (*App) RunStream ¶
func (a *App) RunStream(ctx context.Context, req *backend.RunStreamRequest, sender *backend.StreamSender) error
func (*App) SubscribeStream ¶
func (a *App) SubscribeStream(ctx context.Context, req *backend.SubscribeStreamRequest) (*backend.SubscribeStreamResponse, error)
type EventError ¶ added in v0.3.0
type EventError struct {
Error string `json:"error"`
}
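A hedged sketch of how an EventError might be pushed to stream subscribers from within RunStream; the error message is illustrative.

// sender is the *backend.StreamSender passed to RunStream.
b, err := json.Marshal(EventError{Error: "upstream request failed"})
if err != nil {
	return err
}
if err := sender.SendJSON(b); err != nil {
	return err
}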
type LLMGatewaySettings ¶ added in v0.6.0
type LLMGatewaySettings struct {
	// This is the URL of the LLM endpoint of the machine learning backend which proxies
	// the request to our llm-gateway. If empty, the gateway is disabled.
	URL string `json:"url"`
}
LLMGatewaySettings contains the configuration for the Grafana Managed Key LLM solution.
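Since an empty URL disables the gateway (per the field comment above), a helper along these lines could gate Grafana-managed LLM features; gatewayEnabled is hypothetical and not part of this package.

// gatewayEnabled reports whether a gateway URL has been configured.
func gatewayEnabled(s LLMGatewaySettings) bool {
	return s.URL != ""
}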
type OpenAISettings ¶
type OpenAISettings struct {
	// The URL to the OpenAI provider
	URL string `json:"url"`
	// The OrgID to be passed to OpenAI in requests
	OrganizationID string `json:"organizationId"`
	// What OpenAI provider the user selected. Note this can specify using the LLMGateway
	Provider openAIProvider `json:"provider"`
	// Model mappings required for Azure's OpenAI
	AzureMapping [][]string `json:"azureModelMapping"`
	// contains filtered or unexported fields
}
OpenAISettings contains the user-specified OpenAI connection details.
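A sketch of decoding OpenAISettings from a raw JSON payload matching the struct tags above; the provider value and Azure model mapping are assumptions for illustration only.

raw := []byte(`{
	"url": "https://myresource.openai.azure.com",
	"organizationId": "",
	"provider": "azure",
	"azureModelMapping": [["gpt-3.5-turbo", "my-gpt35-deployment"]]
}`)
var s OpenAISettings
if err := json.Unmarshal(raw, &s); err != nil {
	// handle error
}
// s.AzureMapping now holds pairs of OpenAI model name to Azure deployment name.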
type Settings ¶
type Settings struct {
	// Tenant is the stack ID (Hosted Grafana ID) of the instance this plugin
	// is running on.
	Tenant string

	// GrafanaComAPIKey is a grafana.com Editor API key used to interact with the grafana.com API.
	//
	// It is created by the grafana.com API when the plugin is first provisioned for a tenant.
	//
	// It is used when persisting the plugin's settings after setup.
	GrafanaComAPIKey string

	DecryptedSecureJSONData map[string]string

	EnableGrafanaManagedLLM bool `json:"enableGrafanaManagedLLM"`

	// OpenAI related settings
	OpenAI OpenAISettings `json:"openAI"`

	// VectorDB settings. May rely on OpenAI settings.
	Vector vector.VectorSettings `json:"vector"`

	// LLMGateway provides Grafana-managed OpenAI.
	LLMGateway LLMGatewaySettings `json:"llmGateway"`
}
Settings contains the plugin's settings and secrets required by the plugin backend.
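A hedged sketch of how these settings might be populated from the backend.AppInstanceSettings passed to NewApp; loadSettings is a hypothetical helper, not part of this package's documented API.

// loadSettings decodes the app's JSON settings into Settings and copies
// across the decrypted secure JSON data provided by Grafana.
func loadSettings(appSettings backend.AppInstanceSettings) (*Settings, error) {
	settings := &Settings{}
	if err := json.Unmarshal(appSettings.JSONData, settings); err != nil {
		return nil, fmt.Errorf("unmarshal settings: %w", err)
	}
	settings.DecryptedSecureJSONData = appSettings.DecryptedSecureJSONData
	return settings, nil
}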