Documentation
Index
- Constants
- Variables
- type Client
- func (client *Client) Ask(ctx context.Context, prompt string, shouldRetry bool, shouldQuit bool, ...) (err error)
- func (client *Client) GenerateCode(ctx context.Context, prompt string) (code string, err error)
- func (client *Client) SetFull(full bool) *Client
- func (client *Client) SetModel(model Model) *Client
- type Model
Constants
const (
	// ModelChatGPT represents the gpt-3.5-turbo model used by ChatGPT.
	ModelChatGPT = "gpt-3.5-turbo"

	// ModelTextDaVinci3 represents the text-davinci-003 language generation
	// model.
	ModelTextDaVinci3 = "text-davinci-003"

	// ModelCodeDaVinci2 represents the code-davinci-002 code generation model.
	ModelCodeDaVinci2 = "code-davinci-002"
)
Variables
var (
	// ErrResultTruncated is returned when the OpenAI API returned a truncated
	// result. The reason for the truncation will be appended to the error
	// string.
	ErrResultTruncated = errors.New("result was truncated")

	// ErrNoResults is returned if the OpenAI API returned an empty result. This
	// should not generally happen.
	ErrNoResults = errors.New("no results return from API")

	// ErrUnsupportedModel is returned if the SetModel method is provided with
	// an unsupported model
	ErrUnsupportedModel = errors.New("unsupported model")

	// ErrUnexpectedStatus is returned when the OpenAI API returned a response
	// with an unexpected status code
	ErrUnexpectedStatus = errors.New("OpenAI returned unexpected response")

	// ErrRequestFailed is returned when the OpenAI API returned an error for
	// the request
	ErrRequestFailed = errors.New("request failed")
)
var MaxTokens = 4096
MaxTokens is the maximum number of tokens supported by the model used. Newer OpenAI models support a maximum of 4096 tokens.
var SupportedModels = []string{ModelChatGPT, ModelTextDaVinci3, ModelCodeDaVinci2}
SupportedModels is a list of all models supported by aiac.
Functions
This section is empty.
Types
type Client
type Client struct {
	*requests.HTTPClient
	// contains filtered or unexported fields
}
Client is a structure used to continuously generate IaC code via OpenAI/ChatGPT.
func NewClient
NewClient creates a new instance of the Client struct, with the provided input options. Neither the OpenAI API nor ChatGPT are yet contacted at this point.
func (*Client) Ask
func (client *Client) Ask(
	ctx context.Context,
	prompt string,
	shouldRetry bool,
	shouldQuit bool,
	outputPath string,
) (err error)
Ask asks the OpenAI API to generate code based on the provided prompt. It is only meant to be used in command line applications (see GenerateCode for library usage). The generated code is always printed to standard output, but may optionally also be stored in the file whose path is provided via the outputPath argument. To only print to standard output, provide an empty string or a dash ("-") as outputPath. If shouldRetry is true, you will be prompted, after the response is printed to standard output, whether to regenerate it in case you are unhappy with it. If shouldQuit is true, the code is printed to standard output and the function returns immediately, without storing to a file or asking whether to regenerate the response.
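For illustration only, a minimal sketch of a command-line-style call. It assumes a *Client previously obtained from NewClient and that the package's identifiers are in scope (the package's import path is not shown in this documentation); the prompt and output path are arbitrary.

// Sketch: client is assumed to be a *Client returned by NewClient.
func generateVPC(ctx context.Context, client *Client) error {
	return client.Ask(
		ctx,
		"generate Terraform code for an AWS VPC",
		true,     // shouldRetry: prompt whether to regenerate after printing
		false,    // shouldQuit: do not return immediately after printing
		"vpc.tf", // outputPath: also save the code to this file ("-" prints only)
	)
}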
func (*Client) GenerateCode
func (client *Client) GenerateCode(ctx context.Context, prompt string) (code string, err error)
GenerateCode sends the provided prompt to the OpenAI API and returns the generated code.
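As an illustration, a minimal sketch of library usage under the same assumptions as above (a *Client from NewClient, the package's identifiers in scope, and the standard library context and errors packages imported). It also assumes the model constants are assignable to the Model type, and shows handling of the sentinel errors defined in this package.

// Sketch: client is assumed to be a *Client returned by NewClient.
func dockerfileFor(ctx context.Context, client *Client) (string, error) {
	// SetModel returns *Client, so it can be chained with GenerateCode.
	code, err := client.SetModel(ModelTextDaVinci3).
		GenerateCode(ctx, "generate a Dockerfile for a Go application")
	if err != nil {
		switch {
		case errors.Is(err, ErrResultTruncated):
			// The result was cut off; the truncation reason is appended
			// to the error string.
		case errors.Is(err, ErrNoResults):
			// The API returned an empty result.
		}
		return "", err
	}
	return code, nil
}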