Syndicate AI




Syndicate SDK is a lightweight, flexible, and extensible toolkit for building intelligent conversational agents in Golang. It enables you to create agents, engineer prompts, generate tool schemas, manage memory, and orchestrate complex workflows—making it easy to integrate advanced AI capabilities into your projects. 🚀

Features

  • Agent Management 🤖: Easily build and configure agents with custom system prompts, tools, and memory.
  • Prompt Engineering 📝: Create structured prompts with nested sections for improved clarity.
  • Tool Schemas 🔧: Generate JSON schemas from Go structures to define tools and validate user inputs.
  • Memory Implementations 🧠: Use built-in SimpleMemory or implement your own memory storage that adheres to the Memory interface.
  • Orchestrator 🎛️: Manage multiple agents and execute them in a predefined sequence to achieve complex workflows.
  • Extendable 🔌: The SDK is designed to be unopinionated, allowing seamless integration into your projects.
  • CLI (Planned) 🚀: Future CLI support to initialize projects and scaffold agents with simple commands.

Installation

To install Syndicate SDK, use Go modules:

go get github.com/Dieg0Code/syndicate

Ensure that you have Go installed and configured in your development environment.

Quick Start

Below is a simple example demonstrating how to create a haiku agent using the SDK:

package main

import (
	"context"
	"fmt"
	"log"

	syndicate "github.com/Dieg0Code/syndicate"
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Create an LLMClient for the standard OpenAI endpoint using your API key.
	client := syndicate.NewOpenAIClient("YOUR_OPENAI_API_KEY")

	// Create a simple memory instance.
	memory := syndicate.NewSimpleMemory()

	// Build a structured prompt using PromptBuilder to instruct the agent to respond in haiku.
	systemPrompt := syndicate.NewPromptBuilder(). 
		CreateSection("Introduction"). 
		AddText("Introduction", "You are an agent that always responds in haiku format."). 
		CreateSection("Guidelines"). 
		AddListItem("Guidelines", "Keep responses in a three-line haiku format (5-7-5 syllables)."). 
		AddListItem("Guidelines", "Be creative and concise."). 
		Build()

	fmt.Println("System Prompt:")
	fmt.Println(systemPrompt)

	// Build the agent using AgentBuilder.
	agent, err := syndicate.NewAgentBuilder().
		SetClient(client).
		SetName("HaikuAgent").
		SetConfigPrompt(systemPrompt).
		SetMemory(memory).
		SetModel(openai.GPT4).
		Build()
	if err != nil {
		log.Fatalf("Error building agent: %v", err)
	}

	// Process a sample input with the agent (the second argument is the user name).
	response, err := agent.Process(context.Background(), "User", "What is the weather like today?")
	if err != nil {
		log.Fatalf("Error processing request: %v", err)
	}

	fmt.Println("\nAgent Response:")
	fmt.Println(response)
}
Orchestration Example

Syndicate SDK also provides an orchestrator to manage a sequence of agents. In the following example, two agents are created and executed in sequence:

package main

import (
	"context"
	"fmt"
	"log"

	syndicate "github.com/Dieg0Code/syndicate"
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Create an LLMClient for the standard OpenAI endpoint using your API key.
	client := syndicate.NewOpenAIClient("YOUR_API_KEY")

	// Create simple memory instances for each agent.
	memoryAgentOne := syndicate.NewSimpleMemory()
	memoryAgentTwo := syndicate.NewSimpleMemory()

	// Build the first agent (HelloAgent).
	agentOne, err := syndicate.NewAgentBuilder().
		SetClient(client).
		SetName("HelloAgent").
		SetConfigPrompt("You are an agent that warmly greets users and encourages further interaction.").
		SetMemory(memoryAgentOne).
		SetModel(openai.GPT4).
		Build()
	if err != nil {
		log.Fatalf("Error building HelloAgent: %v", err)
	}

	// Build the second agent (FinalAgent).
	agentTwo, err := syndicate.NewAgentBuilder().
		SetClient(client).
		SetName("FinalAgent").
		SetConfigPrompt("You are an agent that provides a final summary based on the conversation.").
		SetMemory(memoryAgentTwo).
		SetModel(openai.GPT4).
		Build()
	if err != nil {
		log.Fatalf("Error building FinalAgent: %v", err)
	}

	// Create an orchestrator, register both agents, and define the execution sequence.
	orchestrator := syndicate.NewOrchestratorBuilder().
		AddAgent(agentOne).
		AddAgent(agentTwo).
		// Define the processing sequence: first HelloAgent, then FinalAgent.
		SetSequence([]string{"HelloAgent", "FinalAgent"}).
		Build()

	// Provide an input and process the sequence (the second argument is the user name).
	input := "Please greet the user and provide a summary."
	response, err := orchestrator.ProcessSequence(context.Background(), "User", input)
	if err != nil {
		log.Fatalf("Error processing sequence: %v", err)
	}

	fmt.Println("Final Orchestrator Response:")
	fmt.Println(response)
}
Tools

Syndicate SDK includes functionality to automatically generate JSON schemas from Go structures. These generated schemas can be used to define and validate tools for your agents.

For example, consider the following tool definition that generates a JSON schema for a Product structure:

package main

import (
	"encoding/json"
	"fmt"
	"log"

	syndicate "github.com/Dieg0Code/syndicate"
)

// Product represents a product with various attributes.
type Product struct {
	ID        int     `json:"id" description:"Unique product identifier" required:"true"`
	Name      string  `json:"name" description:"Product name" required:"true"`
	Category  string  `json:"category" description:"Category of the product" enum:"Electronic,Furniture,Clothing"`
	Price     float64 `json:"price" description:"Price of the product"`
	Available bool    `json:"available" description:"Product availability" required:"true"`
}

func main() {
	// Generate the JSON schema for the Product struct.
	schema, err := syndicate.GenerateSchema(Product{})
	if err != nil {
		log.Fatal(err)
	}
	output, err := json.MarshalIndent(schema, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(output))
}

The schema generation leverages reflection along with custom struct tags (e.g., description, required, enum) to produce a JSON Schema that describes the tool's expected input. This schema can then be used to interface with language models or validate user-provided data.
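The same schema machinery can back a complete tool. Below is a minimal sketch of a hypothetical tool that satisfies the Tool interface documented further down; the search_product name, the SearchProductInput type, and the lookup logic are illustrative and not part of the SDK.

package main

import (
	"encoding/json"
	"fmt"
	"log"

	syndicate "github.com/Dieg0Code/syndicate"
)

// SearchProductInput describes the arguments the tool accepts (illustrative type).
type SearchProductInput struct {
	Name string `json:"name" description:"Product name to search for" required:"true"`
}

// SearchProductTool is a hypothetical tool implementing the Tool interface.
type SearchProductTool struct{}

// GetDefinition exposes the tool's name, description, and parameter schema.
func (t SearchProductTool) GetDefinition() syndicate.ToolDefinition {
	schema, err := syndicate.GenerateRawSchema(SearchProductInput{})
	if err != nil {
		log.Fatalf("Error generating schema: %v", err)
	}
	return syndicate.ToolDefinition{
		Name:        "search_product",
		Description: "Searches the catalog for a product by name.",
		Parameters:  schema,
	}
}

// Execute unmarshals the arguments and returns a result for the agent.
func (t SearchProductTool) Execute(args json.RawMessage) (interface{}, error) {
	var input SearchProductInput
	if err := json.Unmarshal(args, &input); err != nil {
		return nil, err
	}
	// Placeholder lookup; a real tool would query a database or API here.
	return fmt.Sprintf("Product %q is in stock.", input.Name), nil
}

func main() {
	fmt.Println(SearchProductTool{}.GetDefinition().Name)
}

Such a tool can then be registered on an agent with AddTool(SearchProductTool{}) before calling Process.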

Dependencies and Their Licenses

This project uses the following third-party libraries:

  • github.com/sashabaranov/go-openai

Please refer to their respective repositories for the full license texts.

Contributing

Contributions are welcome! Feel free to open issues or submit pull requests on GitHub.

Documentation

Overview

Package syndicate provides an SDK for interfacing with OpenAI's API, offering agents that process inputs, manage tool execution, and maintain memory.

Constants

const (
	RoleSystem    = "system"
	RoleDeveloper = "developer"
	RoleUser      = "user"
	RoleAssistant = "assistant"
	RoleTool      = "tool"
)

Role constants define standard message roles across different providers.

const (
	FinishReasonStop      = "stop"
	FinishReasonLength    = "length"
	FinishReasonToolCalls = "tool_calls"
)

FinishReason constants define standard reasons for completion.

Variables

This section is empty.

Functions

func GenerateRawSchema

func GenerateRawSchema(v any) (json.RawMessage, error)

GenerateRawSchema wraps GenerateSchema and returns the JSON marshalled schema. Before marshalling, it validates the generated schema using ValidateDefinition.

func ValidateDefinition

func ValidateDefinition(def *Definition) error

ValidateDefinition recursively validates the generated JSON Schema definition. It ensures that required fields exist, arrays have items defined, that enum values are not empty, and that if AdditionalProperties is set, it conforms to accepted types (bool, Definition, or *Definition).

Types

type Agent

type Agent interface {
	Process(ctx context.Context, userName string, input string, additionalMessages ...[]Message) (string, error)
	AddTool(tool Tool)
	SetConfigPrompt(prompt string)
	GetName() string
}

Agent defines the interface for processing inputs and managing tools. Implementations of Agent should support processing messages, adding tools, configuring prompts, and providing a name identifier.

type AgentBuilder

type AgentBuilder struct {
	// contains filtered or unexported fields
}

AgentBuilder provides a fluent, modular way to configure and construct an Agent.

func NewAgentBuilder

func NewAgentBuilder() *AgentBuilder

NewAgentBuilder initializes and returns a new instance of AgentBuilder.

func (*AgentBuilder) AddTool

func (b *AgentBuilder) AddTool(tool Tool) *AgentBuilder

AddTool adds a tool to the agent's configuration, making it available during processing.

func (*AgentBuilder) Build

func (b *AgentBuilder) Build() (*BaseAgent, error)

Build constructs and returns an Agent based on the current configuration. It returns an error if any issues occurred during the builder setup.

func (*AgentBuilder) SetClient

func (b *AgentBuilder) SetClient(client LLMClient) *AgentBuilder

SetClient sets the LLMClient to be used by the agent.

func (*AgentBuilder) SetConfigPrompt

func (b *AgentBuilder) SetConfigPrompt(prompt string) *AgentBuilder

SetConfigPrompt sets the system prompt that configures the agent's behavior.

func (*AgentBuilder) SetJSONResponseFormat

func (b *AgentBuilder) SetJSONResponseFormat(schemaName string, structSchema any) *AgentBuilder

SetJSONResponseFormat configures the agent to use a JSON schema for response formatting, generating the schema from a provided sample type.
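A sketch of how this might look, assuming client and memory are set up as in the Quick Start example; the MovieReview type, its tags, and the schema name are illustrative.

// MovieReview is an illustrative output type; the response schema is generated from it.
type MovieReview struct {
	Title  string `json:"title" description:"Movie title" required:"true"`
	Rating int    `json:"rating" description:"Score from 1 to 10" required:"true"`
}

agent, err := syndicate.NewAgentBuilder().
	SetClient(client).
	SetName("ReviewerAgent").
	SetConfigPrompt("You review movies and reply strictly in JSON.").
	SetMemory(memory).
	SetModel(openai.GPT4).
	// Constrain responses to the MovieReview schema under the given name.
	SetJSONResponseFormat("movie_review", MovieReview{}).
	Build()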

func (*AgentBuilder) SetMemory

func (b *AgentBuilder) SetMemory(memory Memory) *AgentBuilder

SetMemory sets the memory implementation for the agent.

func (*AgentBuilder) SetModel

func (b *AgentBuilder) SetModel(model string) *AgentBuilder

SetModel configures the model to be used by the agent.

func (*AgentBuilder) SetName

func (b *AgentBuilder) SetName(name string) *AgentBuilder

SetName sets the name identifier for the agent.

func (*AgentBuilder) SetTemperature

func (b *AgentBuilder) SetTemperature(temperature float32) *AgentBuilder

SetTemperature sets the temperature parameter for the agent's responses.

type BaseAgent

type BaseAgent struct {
	// contains filtered or unexported fields
}

BaseAgent holds the common implementation of the Agent interface, including the OpenAI client, system prompt, tools, memory, model configuration, and concurrency control.

func (*BaseAgent) AddTool

func (b *BaseAgent) AddTool(tool Tool)

AddTool adds a tool to the agent.

func (*BaseAgent) GetName

func (b *BaseAgent) GetName() string

GetName returns the name identifier of the agent.

func (*BaseAgent) Process

func (b *BaseAgent) Process(ctx context.Context, userName, input string, additionalMessages ...[]Message) (string, error)

Process takes the user input along with optional additional messages, adds the initial user message to memory, and initiates processing with available tools.
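For example, extra context can be injected for a single call without the caller storing it anywhere. This is a snippet, not a full program; it assumes agent was built as in the Quick Start example and that the context, fmt, and log packages are imported.

// One-off context passed only for this call.
extra := []syndicate.Message{
	{Role: syndicate.RoleSystem, Content: "Today is a public holiday."},
}

response, err := agent.Process(context.Background(), "Alice", "Is the store open today?", extra)
if err != nil {
	log.Fatalf("Error processing request: %v", err)
}
fmt.Println(response)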

func (*BaseAgent) SetConfigPrompt

func (b *BaseAgent) SetConfigPrompt(prompt string)

SetConfigPrompt sets the system prompt for the agent, which can be used to configure behavior.

type ChatCompletionRequest

type ChatCompletionRequest struct {
	Model          string           // The model identifier to use (e.g. "gpt-4").
	Messages       []Message        // Conversational messages.
	Tools          []ToolDefinition // Optional tools available to the agent.
	Temperature    float32          // Sampling temperature.
	ResponseFormat *ResponseFormat  // Optional format for the response.
}

ChatCompletionRequest represents a unified chat completion request.

type ChatCompletionResponse

type ChatCompletionResponse struct {
	Choices []Choice // One or more response choices.
	Usage   Usage    // Token usage statistics.
}

ChatCompletionResponse represents a unified response structure from the LLM.

type Choice

type Choice struct {
	Message      Message // The message produced by the LLM.
	FinishReason string  // Reason why generation stopped (e.g. "stop", "length").
}

Choice represents a single completion option.

type DataType

type DataType string

DataType represents a JSON data type in the generated schema.

const (
	Object  DataType = "object"
	Number  DataType = "number"
	Integer DataType = "integer"
	String  DataType = "string"
	Array   DataType = "array"
	Null    DataType = "null"
	Boolean DataType = "boolean"
)

Supported JSON data types.

type DeepseekR1Client

type DeepseekR1Client struct {
	// contains filtered or unexported fields
}

DeepseekR1Client implements LLMClient using the DeepseekR1 SDK.

func (*DeepseekR1Client) CreateChatCompletion

CreateChatCompletion sends the chat request to DeepseekR1. Tools and ResponseFormat are ignored, since DeepseekR1 does not support them.

type Definition

type Definition struct {
	Type                 DataType              `json:"type,omitempty"`
	Description          string                `json:"description,omitempty"`
	Enum                 []string              `json:"enum,omitempty"`
	Properties           map[string]Definition `json:"properties,omitempty"`
	Required             []string              `json:"required,omitempty"`
	Items                *Definition           `json:"items,omitempty"`
	AdditionalProperties any                   `json:"additionalProperties,omitempty"`
}

Definition is a struct for describing a JSON Schema. It includes type, description, enumeration values, properties, required fields, items, and additional properties.

func (*Definition) MarshalJSON

func (d *Definition) MarshalJSON() ([]byte, error)

MarshalJSON provides custom JSON marshalling for the Definition type. It ensures that the Properties map is initialized before marshalling.

type Embedder

type Embedder struct {
	// contains filtered or unexported fields
}

Embedder is responsible for generating embeddings using the OpenAI API.

func (*Embedder) GenerateEmbedding

func (e *Embedder) GenerateEmbedding(ctx context.Context, data string, model ...openai.EmbeddingModel) ([]float32, error)

GenerateEmbedding generates an embedding for the provided data string. It accepts an optional embedding model; if provided, that model overrides the default. Returns a slice of float32 representing the embedding vector or an error if any.

type EmbedderBuilder

type EmbedderBuilder struct {
	// contains filtered or unexported fields
}

EmbedderBuilder provides a fluent API to configure and build an Embedder instance.

func NewEmbedderBuilder

func NewEmbedderBuilder() *EmbedderBuilder

NewEmbedderBuilder initializes a new EmbedderBuilder with default settings.

func (*EmbedderBuilder) Build

func (b *EmbedderBuilder) Build() (*Embedder, error)

Build constructs the Embedder instance based on the current configuration. Returns an error if the required OpenAI client is not configured.

func (*EmbedderBuilder) SetClient

func (b *EmbedderBuilder) SetClient(client *openai.Client) *EmbedderBuilder

SetClient configures the OpenAI client for the Embedder.

func (*EmbedderBuilder) SetModel

SetModel configures the embedding model to be used by the Embedder.
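A minimal sketch of building an Embedder and requesting a vector. The embedding model passed per call is an assumption; any go-openai EmbeddingModel value can be used, or it can be omitted to fall back to the builder's default.

package main

import (
	"context"
	"fmt"
	"log"

	syndicate "github.com/Dieg0Code/syndicate"
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// The EmbedderBuilder takes the official go-openai client directly.
	client := openai.NewClient("YOUR_OPENAI_API_KEY")

	embedder, err := syndicate.NewEmbedderBuilder().
		SetClient(client).
		Build()
	if err != nil {
		log.Fatalf("Error building embedder: %v", err)
	}

	// Override the default embedding model for this call (assumed constant from go-openai).
	vector, err := embedder.GenerateEmbedding(context.Background(), "Syndicate SDK", openai.SmallEmbedding3)
	if err != nil {
		log.Fatalf("Error generating embedding: %v", err)
	}
	fmt.Printf("Embedding length: %d\n", len(vector))
}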

type JSONSchema

type JSONSchema struct {
	Name   string
	Schema json.RawMessage // The raw JSON schema.
	Strict bool            // Indicates whether strict validation is enforced.
}

JSONSchema defines the structure for responses in JSON.

type LLMClient

type LLMClient interface {
	CreateChatCompletion(ctx context.Context, req ChatCompletionRequest) (ChatCompletionResponse, error)
}

LLMClient defines the interface for interacting with LLM providers.

func NewDeepseekR1Client

func NewDeepseekR1Client(apiKey, baseURL string) LLMClient

NewDeepseekR1Client creates a new client for DeepseekR1. It takes the API key and the baseURL (for example, "https://models.inference.ai.azure.com/").

func NewOpenAIAzureClient

func NewOpenAIAzureClient(apiKey, endpoint string) LLMClient

NewOpenAIAzureClient creates an LLMClient for Azure using the provided API key and endpoint. It configures the client with Azure-specific settings.

func NewOpenAIClient

func NewOpenAIClient(apiKey string) LLMClient

NewOpenAIClient creates a new LLMClient using the provided API key with the standard OpenAI endpoint.
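All three constructors return the same LLMClient interface, so providers can be swapped without changing agent code. A sketch with placeholder keys and endpoints:

// Standard OpenAI endpoint.
client := syndicate.NewOpenAIClient("YOUR_OPENAI_API_KEY")

// Azure OpenAI deployment (the endpoint below is a placeholder).
azureClient := syndicate.NewOpenAIAzureClient("YOUR_AZURE_API_KEY", "https://your-resource.openai.azure.com/")

// DeepseekR1 through a compatible endpoint; tools and ResponseFormat are ignored.
deepseekClient := syndicate.NewDeepseekR1Client("YOUR_DEEPSEEK_API_KEY", "https://models.inference.ai.azure.com/")

// Any of these values can be passed to AgentBuilder.SetClient.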

type Memory

type Memory interface {
	// Add appends a complete ChatCompletionMessage to the memory.
	Add(message Message)
	// Get returns a copy of all stored chat messages.
	Get() []Message
	// Clear removes all stored chat messages from memory.
	Clear()
}

Memory defines the interface for managing a history of chat messages. It provides methods for adding messages, retrieving the complete history, and clearing the history.
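Because Memory is a small interface, custom stores can be dropped into SetMemory. Below is a minimal sketch of a hypothetical sliding-window memory that keeps only the most recent N messages; WindowMemory is not part of the SDK and the sync package must be imported.

// WindowMemory is a hypothetical Memory implementation with a bounded history.
type WindowMemory struct {
	mu       sync.RWMutex
	messages []syndicate.Message
	limit    int
}

// Add appends a message and drops the oldest entries beyond the limit.
func (w *WindowMemory) Add(message syndicate.Message) {
	w.mu.Lock()
	defer w.mu.Unlock()
	w.messages = append(w.messages, message)
	if len(w.messages) > w.limit {
		w.messages = w.messages[len(w.messages)-w.limit:]
	}
}

// Get returns a copy of the stored messages to avoid data races.
func (w *WindowMemory) Get() []syndicate.Message {
	w.mu.RLock()
	defer w.mu.RUnlock()
	out := make([]syndicate.Message, len(w.messages))
	copy(out, w.messages)
	return out
}

// Clear removes all stored messages.
func (w *WindowMemory) Clear() {
	w.mu.Lock()
	defer w.mu.Unlock()
	w.messages = nil
}

An instance such as &WindowMemory{limit: 20} can then be passed to AgentBuilder.SetMemory or OrchestratorBuilder.SetGlobalHistory.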

func NewSimpleMemory

func NewSimpleMemory() Memory

NewSimpleMemory creates and returns a new instance of SimpleMemory. It initializes the internal message slice and ensures the memory is ready for use.

type Message

type Message struct {
	Role      string     // One of RoleSystem, RoleUser, RoleAssistant, or RoleTool.
	Content   string     // The textual content of the message.
	Name      string     // Optional identifier for the sender.
	ToolCalls []ToolCall // Optional tool calls made by the assistant.
	ToolID    string     // For tool responses, references the original tool call.
}

Message represents a chat message with standardized fields.

type OpenAIClient

type OpenAIClient struct {
	// contains filtered or unexported fields
}

OpenAIClient implements the LLMClient interface using the OpenAI SDK. It wraps the official OpenAI client and provides a consistent interface for making chat completion requests.

func (*OpenAIClient) CreateChatCompletion

func (o *OpenAIClient) CreateChatCompletion(ctx context.Context, req ChatCompletionRequest) (ChatCompletionResponse, error)

CreateChatCompletion sends a chat completion request to the OpenAI API using the provided request parameters. It converts internal messages and tool definitions to OpenAI formats, sends the request, and maps the response back into the SDK's unified structure.

type Orchestrator

type Orchestrator struct {
	// contains filtered or unexported fields
}

Orchestrator manages multiple agents, maintains a global conversation history, and optionally defines an execution sequence (pipeline) for agents.

func NewOrchestrator

func NewOrchestrator() *Orchestrator

NewOrchestrator creates and returns a new Orchestrator with default settings. It initializes the agents map and the global history with a simple in-memory implementation.

func (*Orchestrator) GetAgent

func (o *Orchestrator) GetAgent(name string) (Agent, bool)

GetAgent retrieves a registered agent by its name in a thread-safe manner.

func (*Orchestrator) Process

func (o *Orchestrator) Process(ctx context.Context, agentName, userName, input string) (string, error)

Process executes a specific agent by combining the global history with the agent's internal memory. It retrieves the target agent, merges global messages with the agent's own messages (if applicable), processes the input, and then updates the global history with both the user input and the agent's response.

func (*Orchestrator) ProcessSequence

func (o *Orchestrator) ProcessSequence(ctx context.Context, userName, input string) (string, error)

ProcessSequence executes a pipeline of agents as defined in the orchestrator's sequence. The output of each agent is used as the input for the next agent in the sequence. It returns the final output from the last agent or an error if processing fails.
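A single registered agent can also be run directly by name. This snippet assumes the orchestrator from the orchestration example above and the usual context, fmt, and log imports.

reply, err := orchestrator.Process(context.Background(), "HelloAgent", "Alice", "Hi there!")
if err != nil {
	log.Fatalf("Error processing agent: %v", err)
}
fmt.Println(reply)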

type OrchestratorBuilder

type OrchestratorBuilder struct {
	// contains filtered or unexported fields
}

OrchestratorBuilder provides a fluent interface for constructing an Orchestrator. It allows developers to configure agents, set a custom global history, and define an execution sequence.

func NewOrchestratorBuilder

func NewOrchestratorBuilder() *OrchestratorBuilder

NewOrchestratorBuilder initializes a new OrchestratorBuilder with default values. It creates an empty agents map, an empty sequence, and a default global history.

func (*OrchestratorBuilder) AddAgent

func (b *OrchestratorBuilder) AddAgent(agent Agent) *OrchestratorBuilder

AddAgent registers an agent with the orchestrator using the agent's name as the key.

func (*OrchestratorBuilder) Build

func (b *OrchestratorBuilder) Build() *Orchestrator

Build constructs and returns an Orchestrator instance based on the current configuration.

func (*OrchestratorBuilder) SetGlobalHistory

func (b *OrchestratorBuilder) SetGlobalHistory(history Memory) *OrchestratorBuilder

SetGlobalHistory sets a custom global conversation history for the orchestrator.

func (*OrchestratorBuilder) SetSequence

func (b *OrchestratorBuilder) SetSequence(seq []string) *OrchestratorBuilder

SetSequence defines the execution order (pipeline) of agents. For example: []string{"agent1", "agent2", "agent3"}.

type PromptBuilder

type PromptBuilder struct {
	// contains filtered or unexported fields
}

PromptBuilder facilitates the construction of a prompt by organizing content into sections and subsections.

func NewPromptBuilder

func NewPromptBuilder() *PromptBuilder

NewPromptBuilder creates and initializes a new PromptBuilder instance.

func (*PromptBuilder) AddListItem

func (pb *PromptBuilder) AddListItem(sectionName, item string) *PromptBuilder

AddListItem adds a numbered list item to the specified section or subsection. The item is trimmed of any unnecessary whitespace.

func (*PromptBuilder) AddListItemF

func (pb *PromptBuilder) AddListItemF(sectionName string, value interface{}) *PromptBuilder

AddListItemF is a helper method that converts any value to its string representation (using JSON marshaling if necessary) and adds it as a numbered list item to the specified section.

func (*PromptBuilder) AddSubSection

func (pb *PromptBuilder) AddSubSection(childName, parentName string) *PromptBuilder

AddSubSection creates a new subsection (child) under the specified parent section. If the parent section does not exist, it is created as a top-level section.

func (*PromptBuilder) AddText

func (pb *PromptBuilder) AddText(sectionName, text string) *PromptBuilder

AddText adds a line of text to the specified section or subsection. It trims any extra whitespace before appending.

func (*PromptBuilder) AddTextF

func (pb *PromptBuilder) AddTextF(sectionName string, value interface{}) *PromptBuilder

AddTextF is a helper method that converts any value to its string representation (using JSON marshaling if necessary) and adds it as a text line to the specified section.

func (*PromptBuilder) Build

func (pb *PromptBuilder) Build() string

Build generates the final prompt by concatenating all top-level sections and their nested subsections. It returns the fully formatted prompt as a single string.

func (*PromptBuilder) CreateSection

func (pb *PromptBuilder) CreateSection(name string) *PromptBuilder

CreateSection adds a new top-level section with the given name to the prompt. If a section with the same name already exists, it is not created again.
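A sketch combining sections, a subsection, list items, and the formatted helpers; the section names and content are illustrative, and the syndicate and fmt imports are assumed.

prompt := syndicate.NewPromptBuilder().
	CreateSection("Persona").
	AddText("Persona", "You are a support assistant for an online store.").
	CreateSection("Policies").
	// Nest a "Refunds" subsection under "Policies".
	AddSubSection("Refunds", "Policies").
	AddListItem("Refunds", "Refunds are processed within 5 business days.").
	// AddTextF marshals non-string values to JSON before appending.
	AddTextF("Refunds", map[string]string{"max_amount": "100 USD"}).
	Build()

fmt.Println(prompt)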

type ResponseFormat

type ResponseFormat struct {
	Type       string      // For example, "json_schema".
	JSONSchema *JSONSchema // The JSON schema that defines the expected response format.
}

ResponseFormat specifies how the LLM should format its response.

type Section

type Section struct {
	Name        string     // Name of the section.
	Lines       []string   // Lines of text contained in the section.
	SubSections []*Section // Nested subsections within this section.
	// contains filtered or unexported fields
}

Section represents a block of content that may include text lines and nested subsections. It is used by the PromptBuilder to structure and format the final prompt.

type SimpleMemory

type SimpleMemory struct {
	// contains filtered or unexported fields
}

SimpleMemory implements a basic in-memory storage for chat messages. It uses a slice to store messages and a RWMutex for safe concurrent access.

func (*SimpleMemory) Add

func (s *SimpleMemory) Add(message Message)

Add appends a complete chat message to the SimpleMemory.

func (*SimpleMemory) Clear

func (s *SimpleMemory) Clear()

Clear removes all stored messages from the memory.

func (*SimpleMemory) Get

func (s *SimpleMemory) Get() []Message

Get returns a copy of all stored chat messages to avoid data races. A copy of the messages slice is returned to ensure that external modifications do not affect the internal state.

type Tool

type Tool interface {
	GetDefinition() ToolDefinition
	Execute(args json.RawMessage) (interface{}, error)
}

Tool defines the interface for executable tools.

type ToolCall

type ToolCall struct {
	ID   string          // Unique identifier for the tool call.
	Name string          // The name of the tool to be invoked.
	Args json.RawMessage // Arguments for the tool, encoded in JSON.
}

ToolCall represents a tool invocation request.

type ToolDefinition

type ToolDefinition struct {
	Name        string          // Name of the tool.
	Description string          // A short description of what the tool does.
	Parameters  json.RawMessage // JSON Schema defining the parameters for the tool.
}

ToolDefinition describes a tool's capabilities.

type Usage

type Usage struct {
	PromptTokens     int // Number of tokens in the prompt.
	CompletionTokens int // Number of tokens generated by the model.
	TotalTokens      int // Total tokens consumed.
}

Usage provides token usage statistics.

Directories

  • examples
