Documentation ¶
Overview ¶
Package googleai implements a langchaingo provider for Google AI LLMs. See https://ai.google.dev/ for more details.
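A minimal usage sketch, not taken from the generated docs below: it assumes the package's New constructor, the llms.GenerateFromSinglePrompt helper, and the usual langchaingo import paths, and it reads the API key from the GOOGLE_API_KEY environment variable.

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/googleai"
)

func main() {
	ctx := context.Background()

	// New is assumed here; the options are documented in the Index below.
	llm, err := googleai.New(ctx, googleai.WithAPIKey(os.Getenv("GOOGLE_API_KEY")))
	if err != nil {
		log.Fatal(err)
	}

	// GenerateFromSinglePrompt wraps GenerateContent for a single text prompt.
	answer, err := llms.GenerateFromSinglePrompt(ctx, llm, "Name one use of goroutines.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(answer)
}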
Index ¶
- Constants
- Variables
- type GoogleAI
- func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
- func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
- func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageContent, ...) (*llms.ContentResponse, error)
- type HarmBlockThreshold
- type Option
- func WithAPIKey(apiKey string) Option
- func WithCloudLocation(l string) Option
- func WithCloudProject(p string) Option
- func WithCredentialsFile(credentialsFile string) Option
- func WithCredentialsJSON(credentialsJSON []byte) Option
- func WithDefaultCandidateCount(defaultCandidateCount int) Option
- func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
- func WithDefaultMaxTokens(maxTokens int) Option
- func WithDefaultModel(defaultModel string) Option
- func WithDefaultTemperature(defaultTemperature float64) Option
- func WithDefaultTopK(defaultTopK int) Option
- func WithDefaultTopP(defaultTopP float64) Option
- func WithHTTPClient(httpClient *http.Client) Option
- func WithHarmThreshold(ht HarmBlockThreshold) Option
- func WithRest() Option
- type Options
Constants ¶
const (
	CITATIONS = "citations"
	SAFETY    = "safety"
	RoleSystem = "system"
	RoleModel  = "model"
	RoleUser   = "user"
	RoleTool   = "tool"
)
Variables ¶
Functions ¶
This section is empty.
Types ¶
type GoogleAI ¶
type GoogleAI struct {
	CallbacksHandler callbacks.Handler
	// contains filtered or unexported fields
}
GoogleAI is a type that represents a Google AI API client.
func (*GoogleAI) Call ¶
func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
Call implements the llms.Model interface.
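A hedged sketch of calling Call directly; llms.WithTemperature and llms.WithMaxTokens are generic langchaingo call options and are assumed to be honored by this provider.

// summarize issues a one-shot prompt through the simple llms.Model interface.
func summarize(ctx context.Context, llm *googleai.GoogleAI) (string, error) {
	return llm.Call(ctx, "Summarize the Go memory model in one sentence.",
		llms.WithTemperature(0.2),
		llms.WithMaxTokens(128),
	)
}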
func (*GoogleAI) CreateEmbedding ¶
func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
CreateEmbedding creates embeddings from texts.
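A sketch of embedding a batch of texts; it assumes an already constructed *GoogleAI client, and the embedding model used is the client's default unless WithDefaultEmbeddingModel was set.

// embedDocs returns one embedding vector per input text.
func embedDocs(ctx context.Context, llm *googleai.GoogleAI) ([][]float32, error) {
	texts := []string{
		"The quick brown fox jumps over the lazy dog.",
		"Go is a statically typed, compiled language.",
	}
	return llm.CreateEmbedding(ctx, texts)
}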
func (*GoogleAI) GenerateContent ¶
func (g *GoogleAI) GenerateContent(
	ctx context.Context,
	messages []llms.MessageContent,
	options ...llms.CallOption,
) (*llms.ContentResponse, error)
GenerateContent implements the llms.Model interface.
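A sketch of a multi-message request; llms.TextParts and the role constants come from the langchaingo llms package, and the call option shown is assumed to override the client's defaults for that request.

// chat sends a short system+user conversation and returns the raw response.
func chat(ctx context.Context, llm *googleai.GoogleAI) (*llms.ContentResponse, error) {
	messages := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, "You are a terse assistant."),
		llms.TextParts(llms.ChatMessageTypeHuman, "What does the context package do?"),
	}
	return llm.GenerateContent(ctx, messages, llms.WithTemperature(0.3))
}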
type HarmBlockThreshold ¶
type HarmBlockThreshold int32
const (
	// HarmBlockUnspecified means threshold is unspecified.
	HarmBlockUnspecified HarmBlockThreshold = 0
	// HarmBlockLowAndAbove means content with NEGLIGIBLE will be allowed.
	HarmBlockLowAndAbove HarmBlockThreshold = 1
	// HarmBlockMediumAndAbove means content with NEGLIGIBLE and LOW will be allowed.
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	// HarmBlockOnlyHigh means content with NEGLIGIBLE, LOW, and MEDIUM will be allowed.
	HarmBlockOnlyHigh HarmBlockThreshold = 3
	// HarmBlockNone means all content will be allowed.
	HarmBlockNone HarmBlockThreshold = 4
)
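A sketch of where a threshold is applied: it is passed at construction time via WithHarmThreshold (documented below); the New constructor is an assumption, as above.

// newStrictClient blocks only content rated HIGH; lower-rated content is allowed.
func newStrictClient(ctx context.Context) (*googleai.GoogleAI, error) {
	return googleai.New(ctx,
		googleai.WithAPIKey(os.Getenv("GOOGLE_API_KEY")),
		googleai.WithHarmThreshold(googleai.HarmBlockOnlyHigh),
	)
}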
type Option ¶
type Option func(*Options)
func WithAPIKey ¶
func WithAPIKey(apiKey string) Option
WithAPIKey passes the API key (token) to the client. This is useful for googleai clients.
func WithCloudLocation ¶
func WithCloudLocation(l string) Option
WithCloudLocation passes the GCP cloud location (region) name to the client. This is useful for vertex clients.
func WithCloudProject ¶
func WithCloudProject(p string) Option
WithCloudProject passes the GCP cloud project name to the client. This is useful for vertex clients.
func WithCredentialsFile ¶
func WithCredentialsFile(credentialsFile string) Option
WithCredentialsFile appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials file.
func WithCredentialsJSON ¶
func WithCredentialsJSON(credentialsJSON []byte) Option
WithCredentialsJSON appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials.
func WithDefaultCandidateCount ¶
func WithDefaultCandidateCount(defaultCandidateCount int) Option
WithDefaultCandidateCount sets the candidate count for the model.
func WithDefaultEmbeddingModel ¶
func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
WithDefaultEmbeddingModel passes a default embedding model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultMaxTokens ¶
func WithDefaultMaxTokens(maxTokens int) Option
WithDefaultMaxTokens sets the maximum token count for the model.
func WithDefaultModel ¶
func WithDefaultModel(defaultModel string) Option
WithDefaultModel passes a default content model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultTemperature ¶
func WithDefaultTemperature(defaultTemperature float64) Option
WithDefaultTemperature sets the default temperature for the model.
func WithDefaultTopK ¶
func WithDefaultTopK(defaultTopK int) Option
WithDefaultTopK sets the TopK for the model.
func WithDefaultTopP ¶
func WithDefaultTopP(defaultTopP float64) Option
WithDefaultTopP sets the TopP for the model.
func WithHTTPClient ¶
func WithHTTPClient(httpClient *http.Client) Option
WithHTTPClient appends a ClientOption that uses the provided HTTP client to make requests. This is useful for vertex clients.
func WithHarmThreshold ¶
func WithHarmThreshold(ht HarmBlockThreshold) Option
WithHarmThreshold sets the safety/harm setting for the model, potentially limiting any harmful content it may generate.
func WithRest ¶
func WithRest() Option
type Options ¶
type Options struct {
	CloudProject          string
	CloudLocation         string
	DefaultModel          string
	DefaultEmbeddingModel string
	DefaultCandidateCount int
	DefaultMaxTokens      int
	DefaultTemperature    float64
	DefaultTopK           int
	DefaultTopP           float64
	HarmThreshold         HarmBlockThreshold
	ClientOptions         []option.ClientOption
}
Options is a set of options for GoogleAI and Vertex clients.
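A sketch of setting provider-wide defaults at construction time; the model names are illustrative placeholders, not values taken from this documentation, and the New constructor is again an assumption.

// newTunedClient sets client-wide defaults; per-call llms.CallOption values
// can still override them on individual requests.
func newTunedClient(ctx context.Context) (*googleai.GoogleAI, error) {
	return googleai.New(ctx,
		googleai.WithAPIKey(os.Getenv("GOOGLE_API_KEY")),
		googleai.WithDefaultModel("gemini-1.5-flash"),             // placeholder model name
		googleai.WithDefaultEmbeddingModel("text-embedding-004"), // placeholder model name
		googleai.WithDefaultTemperature(0.2),
		googleai.WithDefaultMaxTokens(1024),
	)
}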
func DefaultOptions ¶
func DefaultOptions() Options
func (*Options) EnsureAuthPresent ¶
func (o *Options) EnsureAuthPresent()
EnsureAuthPresent attempts to ensure that the client has authentication information. If it does not, it will attempt to use the GOOGLE_API_KEY environment variable.
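A sketch of how Option funcs compose with DefaultOptions and EnsureAuthPresent, for example inside a wrapper that builds an Options value by hand; only identifiers documented on this page are used.

// buildOptions starts from the defaults, applies the caller's options, and
// falls back to the GOOGLE_API_KEY environment variable if no auth was set.
func buildOptions(opts ...googleai.Option) googleai.Options {
	o := googleai.DefaultOptions()
	for _, opt := range opts {
		opt(&o)
	}
	o.EnsureAuthPresent()
	return o
}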
Directories ¶
Path | Synopsis
---|---
internal/cmd | Code generator for vertex.go from googleai.go nolint
palm | package palm implements a langchaingo provider for Google Vertex AI legacy PaLM models.
vertex | package vertex implements a langchaingo provider for Google Vertex AI LLMs, including the new Gemini models.