Documentation ¶
Overview ¶
Package googleai implements a langchaingo provider for Google AI LLMs. See https://ai.google.dev/ for more details.
Index ¶
- Constants
- Variables
- type GoogleAI
- func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
- func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
- func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageContent, ...) (*llms.ContentResponse, error)
- type HarmBlockThreshold
- type Option
- func WithAPIKey(apiKey string) Option
- func WithCloudLocation(l string) Option
- func WithCloudProject(p string) Option
- func WithCredentialsFile(credentialsFile string) Option
- func WithCredentialsJSON(credentialsJSON []byte) Option
- func WithDefaultCandidateCount(defaultCandidateCount int) Option
- func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
- func WithDefaultMaxTokens(maxTokens int) Option
- func WithDefaultModel(defaultModel string) Option
- func WithDefaultTemperature(defaultTemperature float64) Option
- func WithDefaultTopK(defaultTopK int) Option
- func WithDefaultTopP(defaultTopP float64) Option
- func WithHTTPClient(httpClient *http.Client) Option
- func WithHarmThreshold(ht HarmBlockThreshold) Option
- func WithRest() Option
- type Options
Constants ¶
const (
	CITATIONS = "citations"
	SAFETY    = "safety"

	RoleSystem = "system"
	RoleModel  = "model"
	RoleUser   = "user"
	RoleTool   = "tool"
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type GoogleAI ¶
type GoogleAI struct {
	CallbacksHandler callbacks.Handler
	// contains filtered or unexported fields
}
GoogleAI is a type that represents a Google AI API client.
func (*GoogleAI) Call ¶
func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
Call implements the llms.Model interface.
func (*GoogleAI) CreateEmbedding ¶
func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
CreateEmbedding creates embeddings from texts.
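The returned `[][]float32` holds one vector per input text; a common next step is to compare vectors with cosine similarity. A self-contained sketch (the hard-coded vectors stand in for what a real `g.CreateEmbedding(ctx, []string{"cat", "kitten"})` call would return):

```go
package main

import (
	"fmt"
	"math"
)

// cosine computes the cosine similarity of two embedding vectors:
// dot(a, b) / (|a| * |b|).
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	// Stand-in vectors; in practice each row comes from CreateEmbedding.
	emb := [][]float32{{1, 0, 1}, {1, 0.5, 1}}
	fmt.Printf("%.3f\n", cosine(emb[0], emb[1]))
}
```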
func (*GoogleAI) GenerateContent ¶
func (g *GoogleAI) GenerateContent(
	ctx context.Context,
	messages []llms.MessageContent,
	options ...llms.CallOption,
) (*llms.ContentResponse, error)
GenerateContent implements the llms.Model interface.
type HarmBlockThreshold ¶ added in v0.1.9
type HarmBlockThreshold int32
const (
	// HarmBlockUnspecified means threshold is unspecified.
	HarmBlockUnspecified HarmBlockThreshold = 0
	// HarmBlockLowAndAbove means content with NEGLIGIBLE will be allowed.
	HarmBlockLowAndAbove HarmBlockThreshold = 1
	// HarmBlockMediumAndAbove means content with NEGLIGIBLE and LOW will be allowed.
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	// HarmBlockOnlyHigh means content with NEGLIGIBLE, LOW, and MEDIUM will be allowed.
	HarmBlockOnlyHigh HarmBlockThreshold = 3
	// HarmBlockNone means all content will be allowed.
	HarmBlockNone HarmBlockThreshold = 4
)
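The constants above describe which harm-probability levels each threshold allows. The helper below is purely illustrative (the `HarmProbability` type and `blocked` function are not part of this package) and encodes the same semantics as the constant comments:

```go
package main

import "fmt"

type HarmBlockThreshold int32

const (
	HarmBlockUnspecified    HarmBlockThreshold = 0
	HarmBlockLowAndAbove    HarmBlockThreshold = 1
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	HarmBlockOnlyHigh       HarmBlockThreshold = 3
	HarmBlockNone           HarmBlockThreshold = 4
)

// HarmProbability is an illustrative stand-in, ordered least to most harmful.
type HarmProbability int

const (
	Negligible HarmProbability = iota
	Low
	Medium
	High
)

// blocked reports whether content with probability p is blocked under
// threshold t: LowAndAbove allows only NEGLIGIBLE, MediumAndAbove also
// allows LOW, OnlyHigh also allows MEDIUM, and None allows everything.
func blocked(t HarmBlockThreshold, p HarmProbability) bool {
	switch t {
	case HarmBlockLowAndAbove:
		return p >= Low
	case HarmBlockMediumAndAbove:
		return p >= Medium
	case HarmBlockOnlyHigh:
		return p >= High
	default: // Unspecified or None: block nothing in this sketch.
		return false
	}
}

func main() {
	fmt.Println(blocked(HarmBlockMediumAndAbove, Low))  // LOW is allowed
	fmt.Println(blocked(HarmBlockMediumAndAbove, High)) // HIGH is blocked
}
```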
type Option ¶
type Option func(*Options)
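Option is a standard Go functional option: each `With...` constructor returns a closure that mutates an Options value. A miniature, self-contained version of the pattern (the two fields and default values here are illustrative, not the package's actual defaults):

```go
package main

import "fmt"

// Options mirrors a small slice of the package's Options struct.
type Options struct {
	DefaultModel       string
	DefaultTemperature float64
}

// Option mutates an Options value, exactly as in the package.
type Option func(*Options)

func WithDefaultModel(m string) Option {
	return func(o *Options) { o.DefaultModel = m }
}

func WithDefaultTemperature(t float64) Option {
	return func(o *Options) { o.DefaultTemperature = t }
}

// apply starts from defaults and folds each option over them.
func apply(opts ...Option) Options {
	o := Options{DefaultModel: "default-model", DefaultTemperature: 0.5}
	for _, opt := range opts {
		opt(&o)
	}
	return o
}

func main() {
	o := apply(WithDefaultModel("my-model"), WithDefaultTemperature(0.2))
	fmt.Printf("%s %.1f\n", o.DefaultModel, o.DefaultTemperature)
}
```

This is why the client constructor can take a variadic `...Option`: callers set only what they need, and everything else keeps its default.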
func WithAPIKey ¶
WithAPIKey passes the API KEY (token) to the client. This is useful for googleai clients.
func WithCloudLocation ¶ added in v0.1.9
WithCloudLocation passes the GCP cloud location (region) name to the client. This is useful for vertex clients.
func WithCloudProject ¶ added in v0.1.9
WithCloudProject passes the GCP cloud project name to the client. This is useful for vertex clients.
func WithCredentialsFile ¶ added in v0.1.11
WithCredentialsFile appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials file.
func WithCredentialsJSON ¶ added in v0.1.11
WithCredentialsJSON appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials.
func WithDefaultCandidateCount ¶ added in v0.1.10
WithDefaultCandidateCount sets the candidate count for the model.
func WithDefaultEmbeddingModel ¶
WithDefaultEmbeddingModel passes a default embedding model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultMaxTokens ¶ added in v0.1.10
WithDefaultMaxTokens sets the maximum token count for the model.
func WithDefaultModel ¶
WithDefaultModel passes a default content model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultTemperature ¶ added in v0.1.10
WithDefaultTemperature sets the default temperature for the model.
func WithDefaultTopK ¶ added in v0.1.10
WithDefaultTopK sets the TopK for the model.
func WithDefaultTopP ¶ added in v0.1.10
WithDefaultTopP sets the TopP for the model.
func WithHTTPClient ¶ added in v0.1.11
WithHTTPClient appends a ClientOption that uses the provided HTTP client to make requests. This is useful for vertex clients.
func WithHarmThreshold ¶ added in v0.1.9
func WithHarmThreshold(ht HarmBlockThreshold) Option
WithHarmThreshold sets the safety/harm setting for the model, potentially limiting any harmful content it may generate.
type Options ¶ added in v0.1.9
type Options struct {
	CloudProject          string
	CloudLocation         string
	DefaultModel          string
	DefaultEmbeddingModel string
	DefaultCandidateCount int
	DefaultMaxTokens      int
	DefaultTemperature    float64
	DefaultTopK           int
	DefaultTopP           float64
	HarmThreshold         HarmBlockThreshold

	ClientOptions []option.ClientOption
}
Options is a set of options for GoogleAI and Vertex clients.
func DefaultOptions ¶ added in v0.1.9
func DefaultOptions() Options
func (*Options) EnsureAuthPresent ¶ added in v0.1.12
func (o *Options) EnsureAuthPresent()
EnsureAuthPresent attempts to ensure that the client has authentication information. If it does not, it will attempt to use the GOOGLE_API_KEY environment variable.
Directories ¶
Path | Synopsis
---|---
internal |
internal/cmd | Code generator for vertex.go from googleai.go nolint
palm | Package palm implements a langchaingo provider for Google Vertex AI legacy PaLM models.
vertex | Package vertex implements a langchaingo provider for Google Vertex AI LLMs, including the new Gemini models.