README
🦜 Parakeet
Parakeet is the simplest Go library to create GenAI apps with Ollama.
A GenAI app is an application that uses generative AI technology. Generative AI can create new text, images, or other content based on what it's been trained on. So a GenAI app could help you write a poem, design a logo, or even compose a song! These are still under development, but they have the potential to be creative tools for many purposes. - Gemini
✋ Parakeet only supports GenAI apps that generate text (not images, music, ...).
Install
go get github.com/parakeet-nest/parakeet
Simple completion
The simple completion can be used to generate a response for a given prompt with a provided model.
package main

import (
	"fmt"
	"log"

	"github.com/parakeet-nest/parakeet/completion"
	"github.com/parakeet-nest/parakeet/llm"
)

func main() {
	ollamaUrl := "http://localhost:11434"
	model := "tinydolphin"

	options := llm.Options{
		Temperature: 0.5, // default (0.8)
	}

	question := llm.Query{
		Model:   model,
		Prompt:  "Who is James T Kirk?",
		Options: options,
	}

	answer, err := completion.Generate(ollamaUrl, question)
	if err != nil {
		log.Fatal("😡:", err)
	}
	fmt.Println(answer.Response)
}
Simple completion with stream
package main

import (
	"fmt"
	"log"

	"github.com/parakeet-nest/parakeet/completion"
	"github.com/parakeet-nest/parakeet/llm"
)

func main() {
	ollamaUrl := "http://localhost:11434"
	model := "tinydolphin"

	options := llm.Options{
		Temperature: 0.5, // default (0.8)
	}

	question := llm.Query{
		Model:   model,
		Prompt:  "Who is James T Kirk?",
		Options: options,
	}

	_, err := completion.GenerateStream(ollamaUrl, question,
		func(answer llm.Answer) error {
			fmt.Print(answer.Response)
			return nil
		})
	if err != nil {
		log.Fatal("😡:", err)
	}
}
Completion with context
see: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion
The context can be used to keep a short conversational memory for the next completion.
package main

import (
	"fmt"
	"log"

	"github.com/parakeet-nest/parakeet/completion"
	"github.com/parakeet-nest/parakeet/llm"
)

func main() {
	ollamaUrl := "http://localhost:11434"
	model := "tinydolphin"

	options := llm.Options{
		Temperature: 0.5, // default (0.8)
	}

	firstQuestion := llm.Query{
		Model:   model,
		Prompt:  "Who is James T Kirk?",
		Options: options,
	}

	answer, err := completion.Generate(ollamaUrl, firstQuestion)
	if err != nil {
		log.Fatal("😡:", err)
	}
	fmt.Println(answer.Response)
	fmt.Println()

	secondQuestion := llm.Query{
		Model:   model,
		Prompt:  "Who is his best friend?",
		Context: answer.Context,
		Options: options,
	}

	answer, err = completion.Generate(ollamaUrl, secondQuestion)
	if err != nil {
		log.Fatal("😡:", err)
	}
	fmt.Println(answer.Response)
}
Chat completion
The chat completion can be used to generate a conversational response for a given set of messages with a provided model.
package main

import (
	"fmt"
	"log"

	"github.com/parakeet-nest/parakeet/completion"
	"github.com/parakeet-nest/parakeet/llm"
)

func main() {
	ollamaUrl := "http://localhost:11434"
	model := "deepseek-coder"

	systemContent := `You are an expert in computer programming.
	Please make friendly answer for the noobs.
	Add source code examples if you can.`

	userContent := `I need a clear explanation regarding the following question:
	Can you create a "hello world" program in Golang?
	And, please, be structured with bullet points`

	options := llm.Options{
		Temperature:   0.5, // default (0.8)
		RepeatLastN:   2,   // default (64)
		RepeatPenalty: 2.0, // default (1.1)
	}

	query := llm.Query{
		Model: model,
		Messages: []llm.Message{
			{Role: "system", Content: systemContent},
			{Role: "user", Content: userContent},
		},
		Options: options,
		Stream:  false,
	}

	answer, err := completion.Chat(ollamaUrl, query)
	if err != nil {
		log.Fatal("😡:", err)
	}
	fmt.Println(answer.Message.Content)
}
✋ To keep a conversational memory for the next chat completion, update the list of messages with the previous question and answer.
I plan to add support for bbolt in the upcoming v0.0.1 of Parakeet to store the conversational memory.
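The memory-keeping step above can be sketched as follows. This is a minimal, self-contained illustration: it defines its own Message struct mirroring the shape of llm.Message (Role, Content), and the appendExchange helper is a hypothetical name, not part of the Parakeet API.

```go
package main

import "fmt"

// Message mirrors the shape of llm.Message for illustration only.
type Message struct {
	Role    string
	Content string
}

// appendExchange adds the latest question and answer to the history,
// so the next chat completion sees the whole conversation.
func appendExchange(history []Message, question, answer string) []Message {
	return append(history,
		Message{Role: "user", Content: question},
		Message{Role: "assistant", Content: answer},
	)
}

func main() {
	history := []Message{
		{Role: "system", Content: "You are an expert in computer programming."},
	}

	// After each completion, record the exchange...
	history = appendExchange(history, "Who is James T Kirk?", "James T. Kirk is a Starfleet captain.")

	// ...and pass the updated history as the Messages field of the next llm.Query.
	fmt.Println(len(history))
}
```

In a real app you would convert this slice to []llm.Message and set it on the next llm.Query before calling completion.Chat again.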
Chat completion with stream
package main

import (
	"fmt"
	"log"

	"github.com/parakeet-nest/parakeet/completion"
	"github.com/parakeet-nest/parakeet/llm"
)

func main() {
	ollamaUrl := "http://localhost:11434"
	model := "deepseek-coder"

	systemContent := `You are an expert in computer programming.
	Please make friendly answer for the noobs.
	Add source code examples if you can.`

	userContent := `I need a clear explanation regarding the following question:
	Can you create a "hello world" program in Golang?
	And, please, be structured with bullet points`

	options := llm.Options{
		Temperature: 0.5, // default (0.8)
		RepeatLastN: 2,   // default (64)
	}

	query := llm.Query{
		Model: model,
		Messages: []llm.Message{
			{Role: "system", Content: systemContent},
			{Role: "user", Content: userContent},
		},
		Options: options,
		Stream:  false,
	}

	_, err := completion.ChatStream(ollamaUrl, query,
		func(answer llm.Answer) error {
			fmt.Print(answer.Message.Content)
			return nil
		})
	if err != nil {
		log.Fatal("😡:", err)
	}
}
Documentation

Variables

var About = "🦜 Parakeet v0.0.0 🪺 [nest]"
var Version = "v0.0.0"