Documentation ¶
Overview ¶
Package quickstart provides quickstart examples for the Atomic Agents project. These examples demonstrate various features and capabilities of the Atomic Agents framework.
## Example Files
- Basic Chatbot (basic_chatbot_test.go)
  This example demonstrates a simple chatbot using the Atomic Agents framework. It includes:
  - Setting up the OpenAI API client
  - Initializing a basic agent with default configurations
  - Running a chat loop where the user can interact with the agent (a minimal loop sketch follows this list)
- Custom Chatbot (basic_custom_chatbot_test.go)
  This example shows how to create a custom chatbot with:
  - A custom system prompt
  - Customized agent configuration
  - A chat loop with rhyming responses
- Custom Chatbot with Custom Schema (basic_custom_chatbot_with_custom_schema_test.go)
  This example demonstrates:
  - Creating a custom output schema for the agent
  - Implementing suggested follow-up questions in the agent's responses
  - Using a custom system prompt and agent configuration
- Chatbot with Different Providers (basic_chatbot_different_providers_test.go)
  This example showcases:
  - How to use different AI providers (OpenAI, Groq, Ollama)
  - Dynamically selecting a provider at runtime
  - Adapting the agent configuration based on the chosen provider
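The interactive chat loops mentioned above live in the test files and are not reproduced on this page. The following is a minimal sketch only, assuming agent and ctx are configured exactly as in the BasicChatbot example below and using the standard library bufio package; the "exit" command is an illustrative choice, not part of the framework.

    // Minimal chat-loop sketch. Assumes agent and ctx are set up as in the
    // BasicChatbot example below; "exit" is an illustrative quit command.
    scanner := bufio.NewScanner(os.Stdin)
    for {
        fmt.Print("User: ")
        if !scanner.Scan() {
            break // stop on EOF or read error
        }
        text := scanner.Text()
        if text == "exit" {
            break
        }
        input := schema.NewInput(text)
        output := schema.NewOutput("")
        llmResp := new(components.LLMResponse)
        if err := agent.Run(ctx, input, output, llmResp); err != nil {
            fmt.Println(err)
            continue
        }
        fmt.Printf("Agent: %s\n", output.ChatMessage)
    }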
Example (BasicChatbot) ¶
ctx := context.Background()
mem := components.NewMemory(10)
// Seed the conversation memory with an initial assistant greeting.
initMsg := mem.NewMessage(components.AssistantRole, schema.CreateOutput("Hello! How can I assist you today?"))
agent := agents.NewAgent[schema.Input, schema.Output](
    agents.WithClient(examples.NewInstructor(instructor.ProviderOpenAI)),
    agents.WithMemory(mem),
    agents.WithModel(os.Getenv("OPENAI_MODEL")),
    agents.WithTemperature(1),
    agents.WithMaxTokens(1000))
output := schema.NewOutput("")
input := schema.NewInput("Today is 2024-01-01, only respond with the date without any other words")
llmResp := new(components.LLMResponse)
if err := agent.Run(ctx, input, output, llmResp); err != nil {
    fmt.Println(err)
    return
}
fmt.Println(agent.SystemPrompt())
fmt.Println("")
fmt.Printf("Agent: %s\n", initMsg.Content().(schema.Output).ChatMessage)
fmt.Printf("User: %s\n", input.ChatMessage)
fmt.Printf("Agent: %s\n", output.ChatMessage)
Output:

# IDENTITY and PURPOSE
- This is a conversation with a helpful and friendly AI assistant.

# OUTPUT INSTRUCTIONS
- Always respond using the proper JSON schema.
- Always use the available additional information and context to enhance the response.

Agent: Hello! How can I assist you today?
User: Today is 2024-01-01, only respond with the date without any other words
Agent: 2024-01-01
Example (BasicChatbotWithDifferentProviders) ¶
ctx := context.Background()
providers := []instructor.Provider{instructor.ProviderOpenAI, instructor.ProviderAnthropic, instructor.ProviderCohere}
for _, provider := range providers {
    // Pick a model appropriate for each provider.
    var model string
    switch provider {
    case instructor.ProviderOpenAI:
        model = os.Getenv("OPENAI_MODEL")
    case instructor.ProviderAnthropic:
        model = "claude-3-5-haiku-20241022"
    case instructor.ProviderCohere:
        model = "command-r-plus"
    }
    mem := components.NewMemory(10)
    initMsg := mem.NewMessage(components.AssistantRole, schema.CreateOutput("Hello! How can I assist you today?"))
    agent := agents.NewAgent[schema.Input, schema.Output](
        agents.WithClient(examples.NewInstructor(provider)),
        agents.WithMemory(mem),
        agents.WithModel(model),
        agents.WithTemperature(1),
        agents.WithMaxTokens(1000))
    output := schema.NewOutput("")
    input := schema.NewInput("Today is 2024-01-01, only respond with the date without any other words")
    llmResp := new(components.LLMResponse)
    if err := agent.Run(ctx, input, output, llmResp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(agent.SystemPrompt())
    fmt.Println("")
    fmt.Printf("Agent: %s\n", initMsg.Content().(schema.Output).ChatMessage)
    fmt.Printf("User: %s\n", input.ChatMessage)
    fmt.Printf("Agent: %s\n", output.ChatMessage)
}
Output:
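The example above iterates over a fixed list of providers; the overview also mentions selecting a provider dynamically at runtime. A minimal sketch of one way to do that, assuming the choice comes from a hypothetical LLM_PROVIDER environment variable (the variable name is an assumption of this sketch, not part of the framework):

    // Sketch only: LLM_PROVIDER is a hypothetical environment variable chosen for illustration.
    var provider instructor.Provider
    switch os.Getenv("LLM_PROVIDER") {
    case "anthropic":
        provider = instructor.ProviderAnthropic
    case "cohere":
        provider = instructor.ProviderCohere
    default:
        provider = instructor.ProviderOpenAI
    }
    // provider is then handed to examples.NewInstructor(provider) and the agent is
    // built with agents.WithClient(...), exactly as in the loop above.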
Example (BasicCustomChatbot) ¶
ctx := context.Background()
mem := components.NewMemory(10)
initMsg := mem.NewMessage(components.AssistantRole, schema.CreateOutput("How do you do? What can I do for you? Tell me, pray, what is your need today?"))
// Build a custom chain-of-thought system prompt for the agent.
systemPromptGenerator := cot.New(
    cot.WithBackground([]string{
        "- This assistant is a general-purpose AI designed to be helpful and friendly.",
        "- Your name is 'Atomic Agent Custom Chatbot'",
    }),
    cot.WithSteps([]string{
        "- Understand the user's input and provide a relevant response.",
        "- Respond to the user.",
    }),
    cot.WithOutputInstructs([]string{
        "- Provide helpful and relevant information to assist the user.",
        "- Be friendly and respectful in all interactions.",
        "- If asked your name, respond with only your name, without any other additional words.",
    }),
)
agent := agents.NewAgent[schema.Input, schema.Output](
    agents.WithClient(examples.NewInstructor(instructor.ProviderOpenAI)),
    agents.WithMemory(mem),
    agents.WithModel(os.Getenv("OPENAI_MODEL")),
    agents.WithSystemPromptGenerator(systemPromptGenerator),
    agents.WithTemperature(1),
    agents.WithMaxTokens(1000))
input := schema.NewInput("What is your name?")
output := schema.NewOutput("")
llmResp := new(components.LLMResponse)
if err := agent.Run(ctx, input, output, llmResp); err != nil {
    fmt.Println(err)
    return
}
fmt.Println(agent.SystemPrompt())
fmt.Println("")
fmt.Printf("Agent: %s\n", initMsg.Content().(schema.Output).ChatMessage)
fmt.Printf("User: %s\n", input.ChatMessage)
fmt.Printf("Agent: %s\n", output.ChatMessage)
Output:

# IDENTITY and PURPOSE
- This assistant is a general-purpose AI designed to be helpful and friendly.
- Your name is 'Atomic Agent Custom Chatbot'

# INTERNAL ASSISTANT STEPS
- Understand the user's input and provide a relevant response.
- Respond to the user.

# OUTPUT INSTRUCTIONS
- Provide helpful and relevant information to assist the user.
- Be friendly and respectful in all interactions.
- If asked your name, respond with only your name, without any other additional words.
- Always respond using the proper JSON schema.
- Always use the available additional information and context to enhance the response.

Agent: How do you do? What can I do for you? Tell me, pray, what is your need today?
User: What is your name?
Agent: Atomic Agent Custom Chatbot
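The overview describes basic_custom_chatbot_test.go as producing rhyming responses, which the transcript above does not show. A sketch of how that behaviour could be requested with the same cot generator; the instruction text here is illustrative and may differ from the actual test file.

    // Sketch only: the instruction wording is illustrative, not copied from the test file.
    rhymingPromptGenerator := cot.New(
        cot.WithBackground([]string{
            "- This assistant is a general-purpose AI designed to be helpful and friendly.",
        }),
        cot.WithSteps([]string{
            "- Understand the user's input and provide a relevant response.",
            "- Respond to the user.",
        }),
        cot.WithOutputInstructs([]string{
            "- Always answer in rhyming verse.",
            "- Be friendly and respectful in all interactions.",
        }),
    )
    // Pass it to the agent with agents.WithSystemPromptGenerator(rhymingPromptGenerator),
    // exactly as in the example above.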
Example (BasicCustomChatbotWithCustomSchema) ¶
package main

import (
    "context"
    "fmt"
    "os"

    "github.com/bububa/instructor-go"

    "github.com/bububa/atomic-agents/agents"
    "github.com/bububa/atomic-agents/components"
    "github.com/bububa/atomic-agents/components/systemprompt/cot"
    "github.com/bububa/atomic-agents/examples"
    "github.com/bububa/atomic-agents/schema"
)

// CustomOutput represents the response generated by the chat agent, including suggested follow-up questions.
type CustomOutput struct {
    schema.Base
    // ChatMessage is the chat message exchanged between the user and the chat agent.
    ChatMessage string `json:"chat_message,omitempty" jsonschema:"title=chat_message,description=The chat message exchanged between the user and the chat agent."`
    // SuggestedUserQuestions is a list of suggested follow-up questions the user could ask the agent.
    SuggestedUserQuestions []string `json:"suggested_user_questions,omitempty" jsonschema:"title=suggested_user_questions,description=A list of suggested follow-up questions the user could ask the agent."`
}

func main() {
    ctx := context.Background()
    mem := components.NewMemory(10)
    // Seed the memory with an initial assistant message that already uses the custom schema.
    initMsg := mem.NewMessage(components.AssistantRole, CustomOutput{
        ChatMessage:            "Hello! How can I assist you today?",
        SuggestedUserQuestions: []string{"What can you do?", "Tell me a joke", "Tell me about how you were made"},
    })
    systemPromptGenerator := cot.New(
        cot.WithBackground([]string{
            "- This assistant is a knowledgeable AI designed to be helpful, friendly, and informative.",
            "- It has a wide range of knowledge on various topics and can engage in diverse conversations.",
        }),
        cot.WithSteps([]string{
            "- Analyze the user's input to understand the context and intent.",
            "- Formulate a relevant and informative response based on the assistant's knowledge.",
            "- Generate 3 suggested follow-up questions for the user to explore the topic further.",
        }),
        cot.WithOutputInstructs([]string{
            "- Provide clear, concise, and accurate information in response to user queries.",
            "- Maintain a friendly and professional tone throughout the conversation.",
            "- Conclude each response with 3 relevant suggested questions for the user.",
            "- If asked 'What can you do for me?', respond with the fixed message 'I can help you:' and the suggested_user_questions 'kiss me?, hug me?, kill me?'.",
        }),
    )
    agent := agents.NewAgent[schema.Input, CustomOutput](
        agents.WithClient(examples.NewInstructor(instructor.ProviderOpenAI)),
        agents.WithMemory(mem),
        agents.WithModel(os.Getenv("OPENAI_MODEL")),
        agents.WithSystemPromptGenerator(systemPromptGenerator),
        agents.WithTemperature(1),
        agents.WithMaxTokens(1000))
    input := schema.NewInput("What can you do for me?")
    output := new(CustomOutput)
    llmResp := new(components.LLMResponse)
    if err := agent.Run(ctx, input, output, llmResp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(agent.SystemPrompt())
    fmt.Println("")
    fmt.Printf("Agent: %s\n", initMsg.Content().(CustomOutput).ChatMessage)
    fmt.Printf("User: %s\n", input.ChatMessage)
    fmt.Printf("Agent: %s\n", output.ChatMessage)
    for idx, sug := range output.SuggestedUserQuestions {
        fmt.Printf("%d. %s\n", idx+1, sug)
    }
}
Output:

# IDENTITY and PURPOSE
- This assistant is a knowledgeable AI designed to be helpful, friendly, and informative.
- It has a wide range of knowledge on various topics and can engage in diverse conversations.

# INTERNAL ASSISTANT STEPS
- Analyze the user's input to understand the context and intent.
- Formulate a relevant and informative response based on the assistant's knowledge.
- Generate 3 suggested follow-up questions for the user to explore the topic further.

# OUTPUT INSTRUCTIONS
- Provide clear, concise, and accurate information in response to user queries.
- Maintain a friendly and professional tone throughout the conversation.
- Conclude each response with 3 relevant suggested questions for the user.
- If asked 'What can you do for me?', respond with the fixed message 'I can help you:' and the suggested_user_questions 'kiss me?, hug me?, kill me?'.
- Always respond using the proper JSON schema.
- Always use the available additional information and context to enhance the response.

Agent: Hello! How can I assist you today?
User: What can you do for me?
Agent: I can help you:
1. kiss me?
2. hug me?
3. kill me?
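A suggested follow-up question can be sent straight back to the same agent as the next turn. A minimal sketch, reusing agent, ctx, and output from the example above and assuming the memory passed via agents.WithMemory carries the earlier exchange forward:

    // Sketch: feed the first suggested question back to the same agent as a follow-up turn.
    if len(output.SuggestedUserQuestions) > 0 {
        followUp := schema.NewInput(output.SuggestedUserQuestions[0])
        next := new(CustomOutput)
        nextResp := new(components.LLMResponse)
        if err := agent.Run(ctx, followUp, next, nextResp); err != nil {
            fmt.Println(err)
            return
        }
        fmt.Printf("User: %s\n", followUp.ChatMessage)
        fmt.Printf("Agent: %s\n", next.ChatMessage)
    }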