Documentation ¶
Index ¶
- func File(path string) []byte
- func Folder(root string, includeFilter ...string) []byte
- func Image(i image.Image) []byte
- type LanguageModel
- type Oracle
- func (o *Oracle) Ask(question string, references ...any) (string, error)
- func (o *Oracle) AskWithContext(ctx context.Context, question string, references ...any) (string, error)
- func (o *Oracle) Forget() *Oracle
- func (o *Oracle) GiveExample(givenInput string, idealCompletion string)
- func (o *Oracle) Remember() *Oracle
- func (o *Oracle) Reset()
- func (o *Oracle) SetPurpose(purpose string)
- func (o *Oracle) SetResponseFormat(fieldname, description string)
- func (o *Oracle) WithModel(model string) error
- type Prompt
Examples ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func File ¶
A Reference helper that reads a file from disk and returns the contents as a byte slice. The content is a snapshot of the file at the time of calling. Consider calling inside the Ask method or using a closure to ensure lazy evaluation if you're going to be editing the referenced file in place.
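The eager-versus-lazy distinction above can be sketched without goracle itself. The file names and the snapshotDemo helper below are hypothetical, but the difference between snapshotting contents immediately and deferring the read with a closure is the same one that applies to File:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// snapshotDemo contrasts reading a file eagerly (a snapshot, as when
// calling a helper like File ahead of time) with reading it lazily
// via a closure invoked later. It returns both readings.
func snapshotDemo() (eager, lazy string) {
	dir, _ := os.MkdirTemp("", "goracle-demo")
	defer os.RemoveAll(dir)
	path := filepath.Join(dir, "notes.txt")
	os.WriteFile(path, []byte("draft"), 0o644)

	eagerBytes, _ := os.ReadFile(path) // snapshot taken now

	lazyRead := func() []byte { // read deferred until the closure is called
		b, _ := os.ReadFile(path)
		return b
	}

	os.WriteFile(path, []byte("final"), 0o644) // file edited after the snapshot

	return string(eagerBytes), string(lazyRead())
}

func main() {
	eager, lazy := snapshotDemo()
	fmt.Println(eager) // the stale snapshot: "draft"
	fmt.Println(lazy)  // read at call time: "final"
}
```

Calling the closure inside Ask gives you the file as it exists at question time rather than at setup time.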
func Folder ¶
A Reference helper that takes a folder in a filesystem and returns the contents of all files in that folder as a byte slice. The content is a snapshot of the files at the time of calling. The call is recursive, so be careful with what you include. Consider adding one or more filters to the includeFilter, such as ".go" or similar globs, to include only certain files.
Types ¶
type LanguageModel ¶
type LanguageModel interface {
	Completion(ctx context.Context, prompt client.Prompt) (io.Reader, error)
}
LanguageModel is an interface that abstracts a concrete implementation of our language model API call.
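As a sketch of what satisfying this interface involves: the lowercase types below are local stand-ins for client.Prompt and LanguageModel so the snippet compiles on its own (the real client.Prompt carries more than a question). A minimal implementation with a fixed reply, much like the DummyClient used in the examples, looks like this:

```go
package main

import (
	"context"
	"fmt"
	"io"
	"strings"
)

// prompt is a local stand-in for client.Prompt.
type prompt struct{ Question string }

// languageModel is a local stand-in for goracle's LanguageModel interface.
type languageModel interface {
	Completion(ctx context.Context, p prompt) (io.Reader, error)
}

// canned satisfies the interface with a fixed reply.
type canned struct{ reply string }

func (c canned) Completion(ctx context.Context, p prompt) (io.Reader, error) {
	return strings.NewReader(c.reply), nil
}

// complete drives the interface the way an Oracle would: send the
// prompt, then drain the reader into a string.
func complete(m languageModel, question string) string {
	r, err := m.Completion(context.Background(), prompt{Question: question})
	if err != nil {
		return err.Error()
	}
	b, _ := io.ReadAll(r)
	return string(b)
}

func main() {
	fmt.Println(complete(canned{reply: "A friendly LLM response!"}, "A user question"))
}
```

Returning an io.Reader rather than a string lets implementations stream the completion.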
type Oracle ¶
type Oracle struct {
// contains filtered or unexported fields
}
Oracle is a struct that scaffolds a well formed query pipeline, designed to facilitate asking one or many questions of an underlying Large Language Model.
func NewAnthropicOracle ¶
func NewChatGPTOracle ¶
NewChatGPTOracle takes an OpenAI API token and sets up a new ChatGPT Oracle with sensible defaults.
func NewOllamaOracle ¶
func NewOracle ¶
func NewOracle(client LanguageModel) *Oracle
NewOracle returns a new Oracle with sensible defaults.
func (*Oracle) Ask ¶
Ask asks the Oracle a question, and returns the response from the underlying Large Language Model. Ask massages the query and supporting references into a standardised format that is relatively generalisable across models.
Example (StandardTextCompletion) ¶
package main

import (
	"fmt"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// Basic request response text flow
	c := client.NewDummyClient("A friendly LLM response!", nil)
	o := goracle.NewOracle(c)
	answer, err := o.Ask("A user question")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: A friendly LLM response!
Example (WithConversationMemory) ¶
package main

import (
	"fmt"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// For when you want the responses to be stateful
	// and depend on the previous answers.
	// This is the default and matches the typical chatbot experience.
	c := client.NewDummyClient("We talked about the answer to life, the universe, and everything", nil)
	o := goracle.NewOracle(c)
	// This is the default, but can be set manually
	o.Remember()
	_, err := o.Ask("What is the answer to life, the universe, and everything?")
	if err != nil {
		panic(err)
	}
	answer, err := o.Ask("What have we already talked about?")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: We talked about the answer to life, the universe, and everything
Example (WithExamples) ¶
package main

import (
	"fmt"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// Examples allow you to guide the LLM with n-shot learning
	c := client.NewDummyClient("42", nil)
	o := goracle.NewOracle(c)
	o.GiveExample("Fear is the...", "mind killer")
	o.GiveExample("With great power comes...", "great responsibility")
	answer, err := o.Ask("What is the answer to life, the universe, and everything?")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: 42
Example (WithReferences) ¶
package main

import (
	"fmt"
	"image"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// Basic request response text flow with multi-modal references
	c := client.NewDummyClient("Yes. There is a reference to swiss cheese in cheeseDocs/swiss.txt", nil)
	o := goracle.NewOracle(c)
	nonCheeseImage := image.NewRGBA(image.Rect(0, 0, 100, 100))
	answer, err := o.Ask(
		"My question for you is, do any of my references make mention of swiss cheese?",
		"Some long chunk of text, that is notably non related",
		goracle.File("invoice.txt"),
		nonCheeseImage,
		goracle.Folder("~/cheeseDocs/"),
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: Yes. There is a reference to swiss cheese in cheeseDocs/swiss.txt
Example (WithoutConversationMemory) ¶
package main

import (
	"fmt"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// For when you want the responses to be stateless
	// and not depend on the previous answers/examples
	c := client.NewDummyClient("Nothing so far", nil)
	o := goracle.NewOracle(c)
	o.Forget()
	_, err := o.Ask("What is the answer to life, the universe, and everything?")
	if err != nil {
		panic(err)
	}
	answer, err := o.Ask("What have we already talked about?")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: Nothing so far
func (*Oracle) AskWithContext ¶
func (o *Oracle) AskWithContext(ctx context.Context, question string, references ...any) (string, error)
AskWithContext is similar to *Oracle.Ask but allows for a context to be passed in.
Example (WithTimeout) ¶
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/mr-joshcrane/goracle"
	"github.com/mr-joshcrane/goracle/client"
)

func main() {
	// For when you want to limit the amount of time the LLM has to respond
	c := client.NewDummyClient("A friendly LLM response!", nil)
	o := goracle.NewOracle(c)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()
	answer, err := o.AskWithContext(ctx, "A user question")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
Output: A friendly LLM response!
func (*Oracle) Forget ¶
Forget makes the Oracle forget the conversation history and stop keeping track of the context of the conversation. This is useful when you want to ask a single question without previous context affecting the answer.
func (*Oracle) GiveExample ¶
GiveExample adds an example to the list of examples. These examples are used to guide the model's response. Quality of the examples is more important than quantity here. Examples are retained even on a stateless Oracle, which allows stateless Oracles to still benefit from n-shot learning.
func (*Oracle) Remember ¶
Remember makes the Oracle remember the conversation history and keep track of the context of the conversation. This is the default behaviour. References are not persisted between calls in order to keep the prompt size down. If this behaviour is desired, you can pass the references with Oracle.GiveExample like so: oracle.GiveExample(oracle.File("path/to/file"), "<your preferred bot response>")
func (*Oracle) Reset ¶
func (o *Oracle) Reset()
Reset clears the Oracle's previous chat history. Useful for when you hit a context limit. It doesn't affect the Oracle's purpose, or whether it's stateful or not.
func (*Oracle) SetPurpose ¶
SetPurpose sets the purpose of the Oracle, which frames the model's response.
func (*Oracle) SetResponseFormat ¶
type Prompt ¶
type Prompt struct {
	Purpose        string
	InputHistory   []string
	OutputHistory  []string
	References     [][]byte
	Question       string
	ResponseFormat []string
}
Prompt is a struct that scaffolds a well formed prompt, designed in a way that is ideal for Large Language Models. This is the abstraction we pass through to the client library so it can be handled appropriately.
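To make the field roles concrete, here is a sketch of assembling such a prompt by hand, with the struct copied locally so it compiles on its own. newPrompt is a hypothetical helper for illustration, not part of the package:

```go
package main

import "fmt"

// A local copy of the Prompt shape from the docs, so this sketch stands alone.
type Prompt struct {
	Purpose        string
	InputHistory   []string
	OutputHistory  []string
	References     [][]byte
	Question       string
	ResponseFormat []string
}

// newPrompt assembles the fields a client library would receive:
// the framing purpose, the current question, and any supporting references.
func newPrompt(purpose, question string, refs ...[]byte) Prompt {
	return Prompt{
		Purpose:    purpose,
		Question:   question,
		References: refs,
	}
}

func main() {
	p := newPrompt(
		"You are a helpful assistant",
		"What is in my notes?",
		[]byte("notes: buy swiss cheese"),
	)
	fmt.Println(p.Purpose)
	fmt.Println(p.Question)
	fmt.Println(len(p.References))
}
```

InputHistory and OutputHistory pair up previous questions with previous answers, which is how both conversation memory and GiveExample examples reach the model.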
func (Prompt) GetHistory ¶
GetHistory returns a list of examples that are used to guide the model's response. Quality of the examples is more important than quantity here.
func (Prompt) GetPurpose ¶
GetPurpose returns the purpose of the prompt, which frames the model's response.
func (Prompt) GetQuestion ¶
GetQuestion returns the question that the user is asking the Model