darksuitai

package module
v0.0.8
Published: Nov 25, 2024 License: GPL-3.0 Imports: 13 Imported by: 0

README

🕵️ DarkSuitAI

⚡ Blazing production-ready library for building scalable reasoning AI systems ✨


Quick Install

go get github.com/darksuit-ai/darksuitai@latest

🤔 What is DarkSuitAI?

DarkSuitAI is a framework for developing production-ready AI systems powered by large language models (LLMs).

🧱 What can you build with DarkSuitAI?

🧱 Agent Powered AI System

🤖 Chatbots

And much more!

🚀 How does DarkSuitAI bring you straight to production?

The main value props of the DarkSuitAI libraries are:

  1. Components: composable building blocks, tools, and integrations for working with language models. Components are modular, easy to use, and ready for full-scale production AI systems.
  2. Off-the-shelf chains: built-in assemblages of components for accomplishing higher-level tasks.

Off-the-shelf chains make it easy to get started. Components make it easy to customize existing chains and build new ones.

Components

Components fall into the following modules:

📃 Model I/O

This includes prompt management, prompt optimization, a generic interface for chat models, and common utilities for working with model outputs.


package main

import (
	"fmt"
	"log"

	"github.com/darksuit-ai/darksuitai"
	"github.com/joho/godotenv"
)

func main() {
	// either add the API key to your .env (darksuit picks it up) or pass it as an argument
	if err := godotenv.Load(".env"); err != nil {
		log.Printf("Warning: error loading .env file: %v", err)
	}

	args := darksuitai.NewChatLLMArgs()
	args.AddAPIKey([]byte(`your-api-key`)) // pass LLM API key
	// args.SetChatInstruction([]byte(`Your chat instruction goes here`)) // uncomment to pass your own prompt instruction
	args.AddPromptKey("year", []byte(`2024`)) // pass variables to your prompt
	args.SetModelType("openai", "gpt-4o") // set the model
	args.AddModelKwargs(500, 0.8, true, []string{"\nObservation:"}) // set model keyword arguments
	llm, err := args.NewLLM()
	if err != nil {
		// handle the error as you wish
	}
	resp, err := llm.Chat("hello, Sam Ayo from earth🌍. What is your name?")
	if err != nil {
		// handle the error as you wish
	}
	fmt.Println(resp)

	// to stream
	streamText := llm.Stream("hello from earth🌍, what is your name?")
	for r := range streamText {
		fmt.Println(r)
	}
}


package main

import (
	"context"
	"fmt"
	"log"

	"github.com/darksuit-ai/darksuitai"
	"github.com/joho/godotenv"
	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

func main() {

	if err := godotenv.Load(".env"); err != nil {
		log.Printf("Warning: error loading .env file: %v", err)
	}
	user := "your_db_username"
	password := "your_db_password"
	host := "your_db_host"
	port := "your_db_port"
	databaseName := "your_database_name"

	args := darksuitai.NewChatLLMArgs()
	args.AddAPIKey([]byte(`your-api-key`)) // pass LLM API key
	// Set up the MongoDB connection URL
	url := fmt.Sprintf("mongodb://%s:%s@%s:%s/%s?serverSelectionTimeoutMS=5000&authSource=mongo_staging&directConnection=true", user, password, host, port, databaseName)

	// Connect to the MongoDB database
	ctx := context.Background()
	client, err := mongo.Connect(ctx, options.Client().ApplyURI(url))
	if err != nil {
		log.Fatal(err)
	}
	// Ping the primary
	if err := client.Ping(ctx, nil); err != nil {
		log.Fatal(err)
	}

	// Get a handle to the database
	db := client.Database(databaseName)
	// args.SetChatInstruction([]byte(`Your chat instruction goes here`)) // uncomment to pass your own prompt instruction
	args.AddPromptKey("year", []byte(`2024`)) // pass variables to your prompt
	args.MongoDB(db) // add the MongoDB handle for chat memory
	args.SetModelType("openai", "gpt-4o") // set the model
	args.AddModelKwargs(500, 0.8, true, []string{"\nObservation:"}) // set model keyword arguments
	convLLM, err := args.NewConvLLM()
	if err != nil {
		// handle the error as you wish
	}
	resp, err := convLLM.Chat("hello, Sam Ayo from earth🌍. What is your name?")
	if err != nil {
		// handle the error as you wish
	}
	fmt.Println(resp)

}

🤖 Agents

Agents allow an LLM autonomy over how a task is accomplished. An agent decides which Action to take, takes that Action, observes the result, and repeats until the task is complete. DarkSuitAI supersedes other agentic frameworks and libraries through its AI self-reflect action control.

🛠️ Tool

Tools in agentic AI are the components that let agents interact with their environment and perform specific tasks. A tool is a function or module the AI can invoke to achieve its goals, ranging from simple data-processing functions to complex decision-making algorithms. By leveraging tools, agents extend their capabilities beyond their core functionality and adapt to varied scenarios and challenges. In the DarkSuitAI framework, tools are defined with the NewTool function, which lets developers create custom tools tailored to specific needs; registered tools are added to ToolNodes, making them available to agents for execution while they complete tasks.

Creating a tool and testing it

To create a Tool, use the NewTool function provided by the DarkSuitAI framework. Here's a sample snippet that creates, registers, and tests a Tool:

func googleSearch(query string) (string, []interface{}, error) {
	// your search logic
	return "search results for: " + query, nil, nil
}

googleSearchTool := darksuitai.NewTool(
	"google search", // tool name
	"this tool is useful for performing web search using Google.", // tool description
	func(query, toolName string, metaData map[string]interface{}) (string, []interface{}, error) {
		return googleSearch(query)
	},
)

// Register the google search Tool in the ToolNodes slice
darksuitai.ToolNodes = append(darksuitai.ToolNodes, googleSearchTool)
result, toolMeta, _ := darksuitai.ToolNodes[0].ToolFunc("about the US", "google search", nil)
fmt.Printf("%s, %v", result, toolMeta)

Building an agent

package main

import (
	"fmt"

	"github.com/darksuit-ai/darksuitai"
)

func main() {
	databaseName := "your_database_name"
	databaseURL := "your_database_url (either mongodb+srv:// or mongodb://)"
	db := darksuitai.NewMongoChatMemory(databaseURL, databaseName) // get the chat-memory collection

	weatherReportTool := darksuitai.NewTool(
		"weather report",
		"this tool is useful for getting the current weather report.",
		func(query, toolName string, metaData map[string]interface{}) (string, []interface{}, error) {
			// your API call to a weather API like OpenWeather
			rawWeatherResultFromAPI := `{"location": "San Francisco", "weather": "sunny", "high": "68°F", "low": "54°F"}`
			return "The weather in San Francisco is sunny with a high of 68°F and a low of 54°F.", []interface{}{rawWeatherResultFromAPI}, nil
		},
	)

	darksuitai.ToolNodes = append(darksuitai.ToolNodes, weatherReportTool)

	args := darksuitai.NewChatLLMArgs()
	args.AddAPIKey([]byte(`your-api-key`)) // pass LLM API key

	// args.SetChatInstruction([]byte(`Your chat instruction goes here`)) // uncomment to pass your own prompt instruction
	args.SetMongoDBChatMemory(db) // set the chat-memory collection
	args.AddPromptKey("year", []byte(`2024`)) // pass variables to your prompt
	args.SetModelType("openai", "gpt-4o") // set the model
	args.AddModelKwargs(1000, 0.8, true, []string{"\nObservation:"}) // set model keyword arguments

	agent, err := args.NewSuitedAgent()
	if err != nil {
		// handle the error as you wish
	}
	err = agent.Program(3, "your-session-id", true) // max iterations, session ID, verbose
	if err != nil {
		// handle the error as you wish
	}
	resp, _, err := agent.Chat("hello, what is the current weather?")
	if err != nil {
		// handle the error as you wish
	}
	fmt.Println(resp)

	// To stream

	streamChan, err := agent.Stream("what is the current weather")
	if err != nil {
		fmt.Println(err)
	}

	for chunk := range streamChan {
		// Process each chunk as it arrives
		fmt.Println(">>", chunk)
	}

}

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

Documentation

Index

Constants

This section is empty.

Variables

View Source
var GoogleSearch = tools.GoogleTool

GoogleSearch is a premade tool provided by the framework from the tools package.

View Source
var ToolNodes = tools.ToolNodes

ToolNodes is a slice that holds all registered tools, allowing them to be accessed by their indices.

View Source
var ToolNodesMeta = tools.ToolNodesMeta

ToolNodesMeta is a variable that holds metadata for all registered tools. This is useful when you need to pass extra data to a tool's logic from other systems.

Functions

func NewMongoChatMemory added in v0.0.4

func NewMongoChatMemory(databaseURI, databaseName string) *mongo.Collection

func NewStreamWriter added in v0.0.4

func NewStreamWriter() *_stream.StreamWriter

func NewTool added in v0.0.4

func NewTool(name string, description string, toolFunc func(string, string, map[string]interface{}) (string, []interface{}, error)) tools.BaseTool

NewTool creates a new instance of BaseTool with the specified name, description, and tool function. The tool function is a callback that takes the query string, the tool name, and a metadata map as input and returns a string, a slice of interfaces, and an error.

Example usage:

myTool := darksuitAI.NewTool("exampleTool", "This is an example tool",
		func(input, toolName string, metaData map[string]interface{}) (string, []interface{}, error) {
			// Your tool logic here
			return input, nil, nil
		})

darksuitAI.ToolNodes = append(darksuitAI.ToolNodes,myTool)

fmt.Printf("all tools created: %v",darksuitAI.ToolNodes) // to see all your tools

Types

type AgentSynapse added in v0.0.4

type AgentSynapse struct {
	// contains filtered or unexported fields
}

func (*AgentSynapse) Chat added in v0.0.4

func (a *AgentSynapse) Chat(input string) (string, any, error)

Chat processes the input query and returns the response. It optionally triggers a callback if verbose mode is enabled.

Parameters:

  • input: The user's input query as a string.
  • sessionId: A string representing the session identifier.

Returns:

  • A string containing the response from the agent.
  • An interface containing any additional tool data.
  • An error if the execution fails.

func (*AgentSynapse) Program added in v0.0.4

func (a *AgentSynapse) Program(maxIteration int, sessionId string, verbose bool) error

func (*AgentSynapse) Stream added in v0.0.4

func (a *AgentSynapse) Stream(input string) (<-chan string, error)

type ConvLLM

type ConvLLM struct {
	// contains filtered or unexported fields
}

func (*ConvLLM) Chat

func (d *ConvLLM) Chat(prompt string) (string, error)

Chat ConvLLM exposes the LLM method for conversational chat

func (*ConvLLM) Stream

func (d *ConvLLM) Stream(prompt string) <-chan string

Stream ConvLLM exposes the LLM method for conversational chat stream

type LLM

type LLM struct {
	// contains filtered or unexported fields
}

func (*LLM) Chat

func (d *LLM) Chat(prompt string) (string, error)

Chat LLM exposes the LLM method for chat

func (*LLM) Stream

func (d *LLM) Stream(prompt string) <-chan string

Stream LLM exposes the LLM method for chat stream

type LLMArgs added in v0.0.4

type LLMArgs types.LLMArgs

DarkSuitAI is the main struct that users will interact with

func NewLLMArgs added in v0.0.4

func NewLLMArgs() *LLMArgs

NewLLMArgs creates a new LLMArgs with default values

func (*LLMArgs) AddAPIKey added in v0.0.4

func (args *LLMArgs) AddAPIKey(apiKey []byte)
AddAPIKey sets the API key for the LLMArgs instance.

This method allows you to securely store the API key required for authenticating requests to the chat model service.

Example:

args := darksuitAI.NewLLMArgs()

args.AddAPIKey([]byte("your-api-key"))

In this example, the byte slice containing the API key is set, enabling the chat model to authenticate and process requests.

func (*LLMArgs) AddModelKwargs added in v0.0.4

func (args *LLMArgs) AddModelKwargs(maxTokens int, temperature float64, stream bool, stopSequences []string)
AddModelKwargs adds a new set of model arguments to the ModelKwargs slice in LLMArgs.

This method allows you to specify various parameters for the model's behavior.

Example:

args := darksuitAI.NewLLMArgs()

args.AddModelKwargs(500, 0.8, true, []string{"Human:"})

In this example, the model arguments are set with a maximum of 500 tokens, a temperature of 0.8, streaming enabled, and a stop sequence of "Human:".

func (*LLMArgs) AddPromptKey added in v0.0.4

func (args *LLMArgs) AddPromptKey(key string, value []byte)
AddPromptKey adds a key-value pair to the PromptKeys map in LLMArgs.

This method allows you to dynamically insert or update prompt-specific variables that can be used within the chat instruction template.

Example:

args := darksuitAI.NewLLMArgs()

args.AddPromptKey("year", []byte(`2024`))

args.AddPromptKey("month", []byte(`June`))

In this example, the keys "year" and "month" with their respective values "2024" and "June" are added to the PromptKeys map, which can then be referenced in the chat instruction template.

func (*LLMArgs) NewConvLLM added in v0.0.4

func (cargs *LLMArgs) NewConvLLM() (*ConvLLM, error)

NewConvLLM creates a new instance of DarkSuitAI LLM

func (*LLMArgs) NewLLM added in v0.0.4

func (cargs *LLMArgs) NewLLM() (*LLM, error)

NewLLM creates a new instance of DarkSuitAI LLM

func (*LLMArgs) NewSuitedAgent added in v0.0.4

func (cargs *LLMArgs) NewSuitedAgent() (*AgentSynapse, error)

NewSuitedAgent creates a new instance of DarkSuitAI Agent

func (*LLMArgs) SetChatInstruction added in v0.0.4

func (args *LLMArgs) SetChatInstruction(prompt []byte)
SetChatInstruction sets the chat instruction in LLMArgs.

This method allows you to define the main instruction or prompt that will guide the chat model's responses.

Example:

args := darksuitAI.NewLLMArgs()

args.SetChatInstruction([]byte("Your chat instruction goes here"))

In this example, the byte slice containing the chat instruction is set, which will be used by the chat model to generate responses.

func (*LLMArgs) SetChatSystemInstruction added in v0.0.4

func (args *LLMArgs) SetChatSystemInstruction(systemPrompt []byte)
SetChatSystemInstruction sets the system-level instruction in LLMArgs.

This method allows you to define the overarching system prompt that will guide the chat model's behavior.

Example:

args := darksuitAI.NewLLMArgs()

args.SetChatSystemInstruction([]byte("Your system prompt goes here"))

In this example, the byte slice containing the system prompt is set, which will be used by the chat model to maintain context and behavior.

func (*LLMArgs) SetModelType added in v0.0.4

func (args *LLMArgs) SetModelType(key, value string)
SetModelType sets a key-value pair in the ModelType map in LLMArgs.

This method allows you to specify the type of model to be used for the chat.

Example:

args := darksuitAI.NewLLMArgs()

args.SetModelType("openai", "gpt-4o")

In this example, the key "openai" with the value "gpt-4o" is added to the ModelType map, indicating the model type to be used.

func (*LLMArgs) SetMongoDBChatMemory added in v0.0.4

func (args *LLMArgs) SetMongoDBChatMemory(collection *mongo.Collection)
SetMongoDBChatMemory sets the MongoDB collection in LLMArgs.

This method allows you to set the MongoDB collection that will be used for storing and retrieving chat-related data.

Example:

args := darksuitAI.NewLLMArgs()

args.SetMongoDBChatMemory(mongoCollection)

In this example, the MongoDB ChatMemory is set, which will be used for chat history operations.
