Project: chat-gpeasy
This project provides two main features:
- a proxy for the OpenAI API, implemented as a hookable interface that leverages the Go SDK
- an opinionated but easy-to-use wrapper for the Go OpenAI SDK that makes creating workflows in Go easier
This addresses the desire to consume the OpenAI API while preserving some level of content/data ownership over the queries posed by clients using the API. You can find more details in this presentation (TODO: link to presentation) and slides.
Components
OpenAI Proxy
There is an example proxy service located in the ./cmd/bin folder which implements the service hook interface by dumping the request/response parameters for each OpenAI API call to the console.
To run the example proxy:
$ cd ./cmd/bin
$ go run cmd.go
Then run one of the example clients in the ./examples folder, such as the Cumulative client, which by default connects to the proxy on your localhost.
$ cd ./examples/cumulative
$ go run cmd.go
Opinionated OpenAI Clients
For convenience, this repository also implements a number of opinionated clients, called personas, that wrap the Go OpenAI SDK but are conceptually easier to wrap your brain around.
Simple Persona
This provides a very simple Q&A client: one question gives you one answer, and each question or answer does not affect or influence the next.
To create a client:
persona, err := personas.NewSimpleChat()
if err != nil {
	fmt.Printf("personas.NewSimpleChat error: %v\n", err)
	os.Exit(1)
}
Initialize the client with the model you intend to use:
(*persona).Init(interfaces.SkillTypeGeneric, openai.GPT3Dot5Turbo) // openai.GPT3Dot5Turbo is the default or use "" empty string
Ask ChatGPT a question:
prompt := "Hello! How are you doing?"
choices, err := (*persona).Query(ctx, prompt)
if err != nil {
	fmt.Printf("persona.Query error: %v\n", err)
	os.Exit(1)
}
fmt.Printf("Me:\n%s\n", prompt)
fmt.Printf("\n\nChatGPT:\n%s\n", choices[0].Message.Content)
Cumulative Persona
Simple interface for your typical chatbot-style (cumulative, conversation-building) client, where the context of the conversation (i.e., the previous questions and answers) influences the next response. Like a real conversation...
To create a client:
persona, err := personas.NewCumulativeChat()
if err != nil {
	fmt.Printf("personas.NewCumulativeChat error: %v\n", err)
	os.Exit(1)
}
Initialize the client with the model you intend to use:
(*persona).Init(interfaces.SkillTypeGeneric, "")
Ask it a question:
prompt = "Tell me about Long Beach, CA."
choices, err = (*persona).Query(ctx, prompt)
if err != nil {
	fmt.Printf("persona.Query error: %v\n", err)
	os.Exit(1)
}
fmt.Printf("Me:\n%s\n", prompt)
fmt.Printf("\n\nChatGPT:\n%s\n", choices[0].Message.Content)
Refine the conversation with more instructions to ChatGPT:
err = (*persona).AddDirective("I want more factual type data")
if err != nil {
	fmt.Printf("persona.AddDirective error: %v\n", err)
	os.Exit(1)
}
Refine the initial question to get a different response:
prompt = "Now... tell me about Long Beach, CA."
choices, err = (*persona).Query(ctx, prompt)
if err != nil {
	fmt.Printf("persona.Query error: %v\n", err)
	os.Exit(1)
}
fmt.Printf("Me:\n%s\n", prompt)
fmt.Printf("\n\nChatGPT:\n%s\n", choices[0].Message.Content)
Advanced Persona
Provides more capabilities/functions on top of the Cumulative client.
To create a client:
persona, err := personas.NewAdvancedChat()
if err != nil {
	fmt.Printf("personas.NewAdvancedChat error: %v\n", err)
	os.Exit(1)
}
Ask ChatGPT a question like before:
prompt = "Tell me about Long Beach, CA."
choices, err = (*persona).Query(ctx, openai.ChatMessageRoleUser, prompt)
if err != nil {
	fmt.Printf("persona.Query error: %v\n", err)
	os.Exit(1)
}
fmt.Printf("Me:\n%s\n", prompt)
fmt.Printf("\n\nChatGPT:\n%s\n", choices[0].Message.Content)
Make a mistake, edit the conversation and regenerate the responses:
fmt.Printf("Oooops... I goofed. I need to edit this...\n\n\n")
conversation, err := (*persona).GetConversation()
if err != nil {
	fmt.Printf("persona.GetConversation error: %v\n", err)
	os.Exit(1)
}
for pos, msg := range conversation {
	if strings.Contains(msg.Content, "Long Beach, CA") {
		prompt = "Tell me about Laguna Beach, CA."
		choices, err := (*persona).EditConversation(pos, prompt)
		if err != nil {
			fmt.Printf("persona.EditConversation error: %v\n", err)
			os.Exit(1)
		}
		fmt.Printf("Me:\n%s\n", prompt)
		fmt.Printf("\n\nChatGPT:\n%s\n", choices[0].Message.Content)
	}
}
Examples
You can find a list of very simple main-style examples that consume this SDK in the examples folder. To run an example, change into its directory and execute the Go file there. For example:
$ cd examples/advanced
$ go run cmd.go
Examples include:
Simple/
- Simple - A very simple Q-then-A client
- Expert - A very simple Q-then-A client, but posing as an expert
- DAN - Easter egg!
- Cumulative - Simple interface for your typical chatbot-style (cumulative conversation-building chat) client
- Advanced - Provides more capabilities/functions on top of the Cumulative client
Vanilla/
- Chat Completion - A plain vanilla OpenAI Q-and-A client (just an example)
- Models - A plain vanilla call to the GET models API (just an example)
- Symbl.ai - A real-time streaming Symbl.ai + ChatGPT integration example
If you have any questions, feel free to contact me on Twitter [@dvonthenen][dvonthenen_twitter] or through our Community Slack.
This SDK is actively developed, and we love to hear from you! Please feel free to create an issue or open a pull request with your questions, comments, suggestions, and feedback. If you liked our integration guide, please star our repo!
This library is released under the Apache 2.0 License.