Documentation ¶
Overview ¶
Package terminal provides an interface to interact with a Generative AI model in a terminal-based chat application. It manages the chat session, user input, AI communication, and displays the chat history.
The package is designed to be simple to use with a focus on a clean user experience. It includes functionality to handle graceful shutdowns, manage chat history, and simulate typing effects for AI responses.
Copyright (c) 2024 H0llyW00dzZ
Index ¶
- Constants
- func Colorize(text string, colorPairs []string, keepDelimiters map[string]bool) string
- func CountTokens(apiKey, input string) (int, error)
- func GetEmbedding(ctx context.Context, client *genai.Client, modelID, text string) ([]float32, error)
- func HandleCommand(input string, session *Session) (bool, error)
- func PrintTypingChat(message string, delay time.Duration)
- func SendMessage(ctx context.Context, client *genai.Client, chatContext string) (string, error)
- func SingleCharColorize(text string, delimiter string, color string) string
- type ChatHistory
- type CommandHandler
- type DebugOrErrorLogger
- type Session
Constants ¶
const (
	ColorRed    = "\033[31m"
	ColorGreen  = "\033[32m"
	ColorYellow = "\033[33m"
	ColorBlue   = "\033[34m"
	ColorPurple = "\033[35m"
	ColorCyan   = "\033[36m"
	ColorReset  = "\033[0m"
)
ANSI color codes
const (
	SignalMessage = "\nReceived an interrupt, shutting down gracefully..."
	StripChars    = "---"
	NewLineChars  = '\n'
	// AnimatedChars renders the AI response one character at a time,
	// showing the user that the AI is "typing" just like a human would.
	AnimatedChars = "%c"
	// ModelAi is subject to change in the future.
	ModelAi = "gemini-pro"
	// TypingDelay may change in the future, for example to allow customizing the delay.
	TypingDelay = 60 * time.Millisecond
)
Defined constants for the terminal package
const (
	YouNerd               = "🤓 You: "
	AiNerd                = "🤖 AI: "
	ContextPrompt         = "Hello! How can I assist you today?"
	ShutdownMessage       = "Shutting down gracefully..."
	UnknownCommand        = "Unknown command."
	ContextPromptShutdown = "The user has issued a quit command. Please provide a shutdown message as you are Assistant."
	ContextCancel         = "Context canceled, shutting down..." // sent as a message to the gopher officer
)
Defined constants for language
const (
	QuitCommand = ":quit"
	PrefixChar  = ":"
)
Defined constants for commands
Note: More commands will be added in the future as needed, for example to change the model or the typing delay; another planned addition is syncing the AI with a goroutine (known as a gopher).
const (
	ErrorGettingShutdownMessage = "Error getting shutdown message from AI: %v"
	ErrorHandlingCommand        = "Error handling command: %v"
	ErrorCountingTokens         = "Error counting tokens: %v"
	ErrorSendingMessage         = "Error sending message to AI: %v"
	ErrorReadingUserInput       = "Error reading user input: %v"
)
Defined list of error messages
const (
	SingleAsterisk = "*"
	DoubleAsterisk = "**"
	SingleBacktick = "`"
	StringNewLine  = "\n"
)
Defined list of characters
const (
DEBUG_MODE = "DEBUG_MODE"
)
Variables ¶
This section is empty.
Functions ¶
func Colorize ¶ added in v0.1.6
func Colorize(text string, colorPairs []string, keepDelimiters map[string]bool) string
Colorize applies ANSI color codes to the text surrounded by specified delimiters. It can process multiple delimiters, each with a corresponding color. The function can also conditionally retain or remove the delimiters in the final output.
Parameters:
text string: The text to be colorized.
colorPairs []string: A slice where each pair of elements represents a delimiter and its color.
keepDelimiters map[string]bool: A map to indicate whether to keep the delimiter in the output.
Returns:
string: The colorized text.
Note: This function may not work as expected in Windows Command Prompt due to its limited support for ANSI color codes. It is designed for terminals that support ANSI, such as those in Linux/Unix environments.
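For illustration, a minimal usage sketch follows. It is not an official package example: the import path "example.com/yourmodule/terminal" is a placeholder for the real module path.

package main

import (
	"fmt"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	// Each pair in colorPairs is a delimiter followed by the color applied
	// to the text it encloses.
	colorPairs := []string{
		terminal.DoubleAsterisk, terminal.ColorYellow,
		terminal.SingleBacktick, terminal.ColorCyan,
	}
	// Keep the backticks in the output; the asterisks are stripped.
	keep := map[string]bool{terminal.SingleBacktick: true}

	fmt.Println(terminal.Colorize("Use **caution** with `rm -rf`", colorPairs, keep))
}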
func CountTokens ¶ added in v0.1.2
func CountTokens(apiKey, input string) (int, error)
CountTokens connects to a generative AI model using the provided API key and counts the number of tokens in the given input string. This function is useful for understanding the token usage of text inputs in the context of generative AI, which can help manage API usage and costs.
Parameters:
apiKey string: The API key used to authenticate with the generative AI service.
input string: The text input for which the number of tokens will be counted.
Returns:
int: The number of tokens that the input string contains.
error: An error that occurred while creating the client, connecting to the service, or counting the tokens. If the operation is successful, the error is nil.
The function creates a new client for each call, which is then closed before returning. It is designed to be a self-contained operation that does not require the caller to manage the lifecycle of the generative AI client.
Note: This function is currently marked as TODO and is not used in main, because the current version of the chat system is considered fully stable with better logic.
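A sketch of how CountTokens might be called. The import path and the GEMINI_API_KEY environment variable name are assumptions, not part of the package.

package main

import (
	"fmt"
	"log"
	"os"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	apiKey := os.Getenv("GEMINI_API_KEY") // assumed environment variable name
	tokens, err := terminal.CountTokens(apiKey, "Hello! How can I assist you today?")
	if err != nil {
		log.Fatalf(terminal.ErrorCountingTokens, err)
	}
	fmt.Println("token count:", tokens)
}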
func GetEmbedding ¶ added in v0.1.2
func GetEmbedding(ctx context.Context, client *genai.Client, modelID, text string) ([]float32, error)
GetEmbedding computes the numerical embedding for a given piece of text using the specified generative AI model. Embeddings are useful for a variety of machine learning tasks, such as semantic search, where they can represent the meaning of text in a form that can be processed by algorithms.
Parameters:
ctx context.Context: The context for controlling the lifetime of the request. It allows the function to be canceled or to time out, and it carries request-scoped values.
client *genai.Client: The client used to interact with the generative AI service. It should be already initialized and authenticated before calling this function.
modelID string: The identifier for the embedding model to be used. This specifies which AI model will generate the embeddings.
text string: The input text to be converted into an embedding.
Returns:
[]float32: An array of floating-point numbers representing the embedding of the input text.
error: An error that may occur during the embedding process. If the operation is successful, the error is nil.
The function delegates the embedding task to the genai client's EmbeddingModel method and retrieves the embedding values from the response. It is the caller's responsibility to manage the lifecycle of the genai.Client, including its creation and closure.
Note: This function is currently marked as TODO and is not used in main, because the current version of the chat system is considered fully stable with better logic.
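A sketch of calling GetEmbedding with a caller-managed client. It assumes the client comes from the github.com/google/generative-ai-go/genai SDK and that "embedding-001" is an available embedding model; the import path of this package is a placeholder.

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/google/generative-ai-go/genai"
	"google.golang.org/api/option"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	ctx := context.Background()
	client, err := genai.NewClient(ctx, option.WithAPIKey(os.Getenv("GEMINI_API_KEY")))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close() // the caller owns the client's lifecycle

	vec, err := terminal.GetEmbedding(ctx, client, "embedding-001", "Hello, world!")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("embedding length:", len(vec))
}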
func HandleCommand ¶ added in v0.1.3
func HandleCommand(input string, session *Session) (bool, error)
HandleCommand interprets the user input as a command and executes the associated action. It uses a map of command strings to their corresponding handler functions to manage different commands and their execution. If the command is recognized, the respective handler is called; otherwise, an unknown command message is displayed.
Parameters:
input string: The user input to be checked for commands.
session *Session: The current chat session for context.
Returns:
bool: A boolean indicating if the input was a command and was handled.
error: An error that may occur while handling the command.
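A sketch of routing input through HandleCommand before treating it as chat text. The import path and the GEMINI_API_KEY environment variable name are placeholders.

package main

import (
	"log"
	"os"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	session, err := terminal.NewSession(os.Getenv("GEMINI_API_KEY"))
	if err != nil {
		log.Fatal(err)
	}

	// ":quit" is recognized as a command, so handled is true and the quit
	// handler runs; plain chat text would come back with handled == false.
	handled, err := terminal.HandleCommand(terminal.QuitCommand, session)
	if err != nil {
		log.Printf(terminal.ErrorHandlingCommand, err)
	}
	log.Println("input was a command:", handled)
}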
func PrintTypingChat ¶
func PrintTypingChat(message string, delay time.Duration)
PrintTypingChat simulates the visual effect of typing out a message character by character. It prints each character of a message to the standard output with a delay between each character to give the appearance of real-time typing.
Parameters:
message string: The message to be displayed with the typing effect.
delay time.Duration: The duration to wait between printing each character.
This function does not return any value. It directly prints to the standard output.
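A minimal sketch that prints the default greeting with the package's typing delay (the import path is a placeholder):

package main

import terminal "example.com/yourmodule/terminal" // placeholder import path

func main() {
	// Prints the greeting one character at a time with a 60ms pause between characters.
	terminal.PrintTypingChat(terminal.ContextPrompt, terminal.TypingDelay)
}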
func SendMessage ¶
func SendMessage(ctx context.Context, client *genai.Client, chatContext string) (string, error)
SendMessage sends a chat message to the generative AI model and retrieves the response. It uses the provided `genai.Client` to communicate with the AI service and simulates a chat interaction by sending the provided chat context.
Parameters:
ctx context.Context: The context for controlling the cancellation of the request.
client *genai.Client: The client instance used to send messages to the AI model.
chatContext string: The chat context or message to be sent to the AI model.
Returns:
string: The AI's response as a string.
error: An error message if the message sending or response retrieval fails.
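A sketch of sending one message and typing out the reply. It assumes the client comes from the github.com/google/generative-ai-go/genai SDK; the import path and environment variable name are placeholders.

package main

import (
	"context"
	"log"
	"os"

	"github.com/google/generative-ai-go/genai"
	"google.golang.org/api/option"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	ctx := context.Background()
	client, err := genai.NewClient(ctx, option.WithAPIKey(os.Getenv("GEMINI_API_KEY")))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	reply, err := terminal.SendMessage(ctx, client, terminal.ContextPrompt)
	if err != nil {
		log.Fatalf(terminal.ErrorSendingMessage, err)
	}
	terminal.PrintTypingChat(reply, terminal.TypingDelay)
}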
func SingleCharColorize ¶ added in v0.1.8
func SingleCharColorize(text string, delimiter string, color string) string
SingleCharColorize applies ANSI color codes to text surrounded by single-character delimiters. It is particularly useful when dealing with text that contains list items or other elements that should be highlighted, and it ensures that the colorization is only applied to the specified delimiter at the beginning of a line.
Parameters:
text string: The text containing elements to be colorized.
delimiter string: The single-character delimiter indicating the start of a colorizable element.
color string: The ANSI color code to be applied to the elements starting with the delimiter.
Returns:
string: The resulting string with colorized elements as specified by the delimiter.
This function handles each line separately and checks for the presence of the delimiter at the beginning after trimming whitespace. If the delimiter is found, it colorizes the delimiter and the following character (typically a space). The rest of the line remains unaltered. If the delimiter is not at the beginning of a line, the line is added to the result without colorization.
Note: As with the Colorize function, SingleCharColorize may not function correctly in Windows Command Prompt or other environments that do not support ANSI color codes. It is best used in terminals that support these codes, such as most Linux/Unix terminals.
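A sketch that highlights list bullets using the package's SingleAsterisk and ColorGreen constants (the import path is a placeholder):

package main

import (
	"fmt"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	text := "* first item\n* second item"
	// Only the leading "*" (and the space after it) on each line is colorized.
	fmt.Println(terminal.SingleCharColorize(text, terminal.SingleAsterisk, terminal.ColorGreen))
}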
Types ¶
type ChatHistory ¶
type ChatHistory struct {
Messages []string
}
ChatHistory holds the chat messages exchanged during a session. It provides methods to add new messages to the history and to retrieve the current state of the conversation.
func (*ChatHistory) AddMessage ¶
func (h *ChatHistory) AddMessage(user, text string)
AddMessage appends a new message to the chat history. It takes the username and the text of the message as inputs and formats them before adding them to the Messages slice.
Parameters:
user string: The username of the individual sending the message.
text string: The content of the message to be added to the history.
This method does not return any value or error. It assumes that all input is valid and safe to add to the chat history.
func (*ChatHistory) GetHistory ¶
func (h *ChatHistory) GetHistory() string
GetHistory concatenates all messages in the chat history into a single string, with each message separated by a newline character. This provides a simple way to view the entire chat history as a single text block.
Returns:
string: A newline-separated string of all messages in the chat history.
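A sketch of building up a history and reading it back. Passing the YouNerd and AiNerd prefixes as the user label is an assumption, and the import path is a placeholder.

package main

import (
	"fmt"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	var history terminal.ChatHistory
	history.AddMessage(terminal.YouNerd, "Hello!")        // assumed: prefix constant used as the user label
	history.AddMessage(terminal.AiNerd, terminal.ContextPrompt)

	// GetHistory returns the conversation as one newline-separated string.
	fmt.Println(history.GetHistory())
}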
func (*ChatHistory) PrintHistory ¶ deprecated
func (h *ChatHistory) PrintHistory()
PrintHistory outputs all messages in the chat history to the standard output, one message per line. This method is useful for displaying the chat history directly to the terminal.
Each message is printed in the order it was added, preserving the conversation flow. This method does not return any value or error.
Deprecated: This method has been replaced by GetHistory. It was used for debugging purposes while the chat system was being built without storage such as a database.
type CommandHandler ¶ added in v0.1.10
CommandHandler defines the function signature for handling chat commands. Each command handler function must conform to this signature.
type DebugOrErrorLogger ¶ added in v0.1.9
type DebugOrErrorLogger struct {
// contains filtered or unexported fields
}
DebugOrErrorLogger provides a simple logger with support for debug and error logging. It encapsulates a standard log.Logger and adds functionality for conditional debug logging and colorized error output.
func NewDebugOrErrorLogger ¶ added in v0.1.9
func NewDebugOrErrorLogger() *DebugOrErrorLogger
NewDebugOrErrorLogger initializes a new DebugOrErrorLogger with a logger that writes to os.Stderr with the standard log flags.
Returns:
*DebugOrErrorLogger: A pointer to a newly created DebugOrErrorLogger.
func (*DebugOrErrorLogger) Debug ¶ added in v0.1.9
func (l *DebugOrErrorLogger) Debug(format string, v ...interface{})
Debug logs a formatted debug message if the DEBUG_MODE environment variable is set to "true". It behaves like Printf and allows for formatted messages.
Parameters:
format string: The format string for the debug message.
v ...interface{}: The values to be formatted according to the format string.
func (*DebugOrErrorLogger) Error ¶ added in v0.1.9
func (l *DebugOrErrorLogger) Error(format string, v ...interface{})
Error logs a formatted error message in red color to signify error conditions. It behaves like Println and allows for formatted messages.
Parameters:
format string: The format string for the error message.
v ...interface{}: The values to be formatted according to the format string.
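A sketch of enabling debug output and logging an error; the import path is a placeholder and the error value is only illustrative.

package main

import (
	"os"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	// Debug messages are only printed when DEBUG_MODE is set to "true".
	os.Setenv(terminal.DEBUG_MODE, "true")

	logger := terminal.NewDebugOrErrorLogger()
	logger.Debug("session started with model %s", terminal.ModelAi)
	logger.Error("failed to reach the AI service: %v", os.ErrDeadlineExceeded) // illustrative error value
}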
type Session ¶
type Session struct {
	Client      *genai.Client      // Client is the generative AI client used to communicate with the AI model.
	ChatHistory ChatHistory        // ChatHistory stores the history of the chat session.
	Ctx         context.Context    // Ctx is the context governing the session, used for cancellation.
	Cancel      context.CancelFunc // Cancel is a function to cancel the context, used for cleanup.
}
Session encapsulates the state and functionality for a chat session with a generative AI model. It holds the AI client, chat history, and context for managing the session lifecycle.
func NewSession ¶
NewSession creates a new chat session with the provided API key for authentication. It initializes the generative AI client and sets up a context for managing the session.
Parameters:
apiKey string: The API key used for authenticating requests to the AI service.
Returns:
*Session: A pointer to the newly created Session object.
error: An error object if initialization fails.
func (*Session) Start ¶
func (s *Session) Start()
Start begins the chat session, managing user input and AI responses. It sets up a signal listener for graceful shutdown and enters a loop to read user input and fetch AI responses indefinitely until an interrupt signal is received.
This method handles user input errors and AI communication errors by logging them and exiting. It ensures resources are cleaned up properly on exit by deferring the cancellation of the session's context and the closure of the AI client.
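Putting it together, a sketch of a minimal program that creates a session and runs the chat loop. The import path and the GEMINI_API_KEY environment variable name are placeholders.

package main

import (
	"log"
	"os"

	terminal "example.com/yourmodule/terminal" // placeholder import path
)

func main() {
	session, err := terminal.NewSession(os.Getenv("GEMINI_API_KEY"))
	if err != nil {
		log.Fatal(err)
	}

	// Start blocks, reading user input and printing AI responses until an
	// interrupt signal (Ctrl+C) triggers a graceful shutdown.
	session.Start()
}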