Documentation ¶
Index ¶
- Constants
- type Gpt
- type Option
- func WithAzureOpenAI(token, endpoint, model, alias string) Option
- func WithMaxChunkSize(maxChunkSize int) Option
- func WithMaxTokens(maxTokens int) Option
- func WithOpenAI(token, model string) Option
- func WithStream(stream bool) Option
- func WithTemperature(temperature float32) Option
- func WithTopP(topP float32) Option
- type Stats
Constants ¶
const (
	SummarizeFileTemplate            = "summarize_file.tmpl"
	SummarizeDiffTemplate            = "summarize_diff.tmpl"
	PrevChunkSummaryTemplate         = "prev_chunk_summary.tmpl"
	SummarizeChangesTemplate         = "summarize_changes.tmpl"
	FinalizeCommitMsgTemplate        = "finalize_commit_msg.tmpl"
	HookPrepareCommitMessageTemplate = "prepare-commit-msg.tmpl"
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Gpt ¶
type Gpt interface {
	SummarizeFile(ctx context.Context, op git.GitOperation, fileName, fileContent string) (string, error)
	SummarizeDiff(ctx context.Context, fileName, diff string) (string, error)
	SummarizeChanges(ctx context.Context, changes []string) (string, error)
	FinalizeCommitMsg(ctx context.Context, prompt string) (string, error)
	GetStats(ctx context.Context) *Stats
}
type Option ¶
type Option func(*client)
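`Option` follows the functional-options pattern: each `With…` helper returns a closure that mutates the unexported client during construction. A minimal self-contained sketch of the pattern, with a hypothetical `newClient` constructor and placeholder field names and defaults (the package's real constructor and fields are not shown on this page):

```go
package main

import "fmt"

// client mirrors the shape of an unexported, option-configured struct.
// Field names and defaults here are placeholders for illustration.
type client struct {
	model       string
	temperature float32
	maxTokens   int
}

// Option configures a client; each With… helper returns one.
type Option func(*client)

func WithTemperature(t float32) Option { return func(c *client) { c.temperature = t } }
func WithMaxTokens(n int) Option       { return func(c *client) { c.maxTokens = n } }

// newClient builds a client with defaults, then applies each option in order.
func newClient(opts ...Option) *client {
	c := &client{model: "default-model", temperature: 1.0, maxTokens: 512}
	for _, opt := range opts {
		opt(c)
	}
	return c
}

func main() {
	c := newClient(WithTemperature(0.2), WithMaxTokens(256))
	fmt.Println(c.model, c.temperature, c.maxTokens)
	// prints "default-model 0.2 256"
}
```

Later options override earlier ones, and omitted options leave the defaults in place, which is what makes this pattern convenient for optional configuration.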
func WithAzureOpenAI ¶
func WithAzureOpenAI(token, endpoint, model, alias string) Option
func WithMaxChunkSize ¶
func WithMaxChunkSize(maxChunkSize int) Option
func WithMaxTokens ¶
func WithMaxTokens(maxTokens int) Option
func WithOpenAI ¶
func WithOpenAI(token, model string) Option
func WithStream ¶
func WithStream(stream bool) Option
func WithTemperature ¶
func WithTemperature(temperature float32) Option
A temperature below 1 makes the model more confident in its top choices, while a temperature greater than 1 decreases that confidence. As the temperature grows, sampling approaches uniform (total randomness). A temperature of 0 is equivalent to argmax/maximum-likelihood decoding, i.e. always picking the highest-probability token.
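The effect described above can be seen numerically by applying a softmax with temperature to a fixed set of logits; this is a generic illustration of temperature sampling, not code from this package:

```go
package main

import (
	"fmt"
	"math"
)

// softmaxT converts logits to a probability distribution at the given
// temperature. Dividing logits by a small temperature sharpens the
// distribution toward the top logit; a large temperature flattens it
// toward uniform.
func softmaxT(logits []float64, temp float64) []float64 {
	probs := make([]float64, len(logits))
	var sum float64
	for i, l := range logits {
		probs[i] = math.Exp(l / temp)
		sum += probs[i]
	}
	for i := range probs {
		probs[i] /= sum
	}
	return probs
}

func main() {
	logits := []float64{2.0, 1.0, 0.5}
	for _, t := range []float64{0.2, 1.0, 2.0} {
		// The probability of the top token shrinks as temperature rises.
		fmt.Printf("T=%.1f -> %.3f\n", t, softmaxT(logits, t))
	}
}
```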