llmcomposer

package module
v0.0.0-...-9a0c461
Published: Apr 28, 2023 License: MIT Imports: 5 Imported by: 0

README

LLMComposer

NOTICE: This is still a work-in-progress. The interfaces provided by this project are still subject to change.

LLMComposer is a Go framework for developing applications that leverage the power of language models. It is built around composability and the chain-of-responsibility pattern, and offers tools to access LLMs, build prompts, and chain calls together. The package also includes output parsers and database integrations, making it a practical toolkit for developers working with language models.

LLMComposer is heavily inspired by the LangChain project.

Usage

In this example we will use the OpenAI API to create a company name and a slogan for a given product. The first step is to create a chain that generates the company name. We will use the LLM chain, which calls the OpenAI API and returns the result, together with a Template prompt that replaces the {product} placeholder with the given product.

Then we will create a second chain to generate a slogan, using a second model configured with the same options but a different prompt.

Finally, we will create a third chain that runs the previous two chains in parallel, waits for both to complete, and then combines the results using a final prompt template converted into a chain function.

package main

import (
	"context"
	"fmt"

	"github.com/deluan/llmcomposer"
	"github.com/deluan/llmcomposer/chains"
	"github.com/deluan/llmcomposer/llms/openai"
	"github.com/deluan/llmcomposer/prompts"
)

func main() {
	ctx := context.Background()

	// Create a chain to generate a company name
	model1, err := openai.NewChatModel()
	if err != nil {
		panic(err)
	}
	prompt1 := prompts.Template(`What is a good name for a company that makes {product}?`)
	chain1 := chains.LLM(model1, prompt1, chains.WithOutputKey("name"))

	// Create a chain to generate a company slogan
	model2, err := openai.NewChatModel(openai.WithSameOptionsAs(model1.CompletionModel))
	if err != nil {
		panic(err)
	}
	prompt2 := prompts.Template(`What is a good slogan for a company that makes {product}?`)
	chain2 := chains.LLM(model2, prompt2, chains.WithOutputKey("slogan"))

	// Create a chain to run the previous ones in parallel, wait for both to complete and then
	// combine the results with a final prompt template converted to a chain function
	chain := chains.Sequential(
		chains.Parallel(2, chain1, chain2),
		chains.Prompt(prompts.Template(`The company {name} makes {product} and their slogan is {slogan}.`)),
	)

	// This call triggers the execution of all chains in the combined sequence
	result, err := chain.Call(ctx, llmcomposer.Values{"product": "colorful socks"})

	fmt.Println(result, err)

	// Output:
	// The company Rainbow Socks Co. makes colorful socks and their slogan is "Life is too short for boring socks – let us add some color to your steps!". <nil>
}

For more features and advanced usage, please check our examples folder.

Installation

To install LLMComposer, use the following command:

go get -u github.com/deluan/llmcomposer

Features

  • Access to LLMs and their capabilities
  • Tools to build prompts and parse outputs
  • Database integrations for seamless data storage and retrieval
  • Inspired by the LangChain project, but striving to stay true to Go idioms

Usage

For examples and detailed usage instructions, please refer to the documentation (WIP). Also check the examples folder.

Contributing

We welcome contributions from the community! Please read our contributing guidelines for more information on how to get started.

License

LLMComposer is released under the MIT License.

Documentation

Index

Constants

const DefaultOutputKey = "text"

Variables

var (
	ErrInvalidPrompt = fmt.Errorf("invalid prompt")
	ErrArgs          = fmt.Errorf("invalid arguments")
	ErrMissingApiKey = fmt.Errorf("missing API key")
	ErrParser        = fmt.Errorf("error parsing output")
	ErrMissingInput  = fmt.Errorf("missing input value")
)
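
These are plain sentinel errors, so callers can check for them with errors.Is, assuming the package returns them directly or wrapped. A minimal sketch, reusing chain1 and ctx from the README example above:

_, err := chain1.Call(ctx) // no "product" value supplied
if errors.Is(err, llmcomposer.ErrMissingInput) {
	// The template needs {product}; depending on how the chain validates its
	// inputs, the omission may surface as ErrMissingInput.
	log.Fatalf("missing input value: %v", err)
}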

Functions

This section is empty.

Types

type Chain

type Chain interface {
	Call(context.Context, ...Values) (Values, error)
}

Chain wraps calls to "chain links", like LLMs, Databases, etc...

TODO: Add support for a context (for timeout, cancellation, etc...)
TODO: Maybe rename it to "Link"?
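
Any type with a matching Call method is a Chain. A minimal sketch of a hand-rolled link (EchoChain is purely illustrative and not part of the package):

// EchoChain is a hypothetical Chain that merges its inputs and returns them unchanged.
type EchoChain struct{}

func (EchoChain) Call(ctx context.Context, values ...llmcomposer.Values) (llmcomposer.Values, error) {
	return llmcomposer.Values{}.Merge(values...), nil
}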

type ChainFunc

type ChainFunc func(context.Context, ...Values) (Values, error)

ChainFunc is a function that implements the Chain interface. It is used to wrap simple functions that can be used as chain links.

func (ChainFunc) Call

func (f ChainFunc) Call(ctx context.Context, values ...Values) (Values, error)
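
A minimal sketch of wrapping an ordinary function as a chain link; it upper-cases whatever arrives under the package's DefaultOutputKey (the transformation itself is illustrative, and ctx is a context.Context):

upcase := llmcomposer.ChainFunc(func(ctx context.Context, values ...llmcomposer.Values) (llmcomposer.Values, error) {
	in := llmcomposer.Values{}.Merge(values...)
	text, _ := in[llmcomposer.DefaultOutputKey].(string)
	return llmcomposer.Values{llmcomposer.DefaultOutputKey: strings.ToUpper(text)}, nil
})

out, err := upcase.Call(ctx, llmcomposer.Values{"text": "hello"})
fmt.Println(out, err)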

type ChatMessage

type ChatMessage struct {
	Text string
	Role string
}

type ChatMessages

type ChatMessages []ChatMessage

func (ChatMessages) Last

func (m ChatMessages) Last(size int) ChatMessages

func (ChatMessages) String

func (m ChatMessages) String() string
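
Last is handy for trimming a conversation to fit a model's context window. A quick sketch (the role names used here are conventional, not mandated by the package):

history := llmcomposer.ChatMessages{
	{Role: "system", Text: "You are a helpful naming assistant."},
	{Role: "user", Text: "Name a company that makes colorful socks."},
	{Role: "assistant", Text: "Rainbow Socks Co."},
}
recent := history.Last(2) // keep only the two most recent messages (assuming Last returns the tail)
fmt.Println(recent.String())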

type ChatPrompt

type ChatPrompt interface {
	Messages(Values) []MessageTemplate
}

type Document

type Document struct {
	PageContent string
	Metadata    map[string]any
}

type DocumentLoader

type DocumentLoader interface {
	Load(ctx context.Context, splitter ...Splitter) ([]Document, error)
}

type DocumentLoaderFunc

type DocumentLoaderFunc func(ctx context.Context, splitter ...Splitter) ([]Document, error)

func (DocumentLoaderFunc) Load

func (f DocumentLoaderFunc) Load(ctx context.Context, splitter ...Splitter) ([]Document, error)
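
A sketch of a loader built from a plain function; it wraps a literal string as Documents and applies a Splitter when one is provided (the loader and its text are illustrative):

loader := llmcomposer.DocumentLoaderFunc(func(ctx context.Context, splitter ...llmcomposer.Splitter) ([]llmcomposer.Document, error) {
	text := "Rainbow Socks Co. makes colorful socks.\n\nTheir slogan is short and sweet."
	if len(splitter) == 0 {
		return []llmcomposer.Document{{PageContent: text}}, nil
	}
	chunks, err := splitter[0](text)
	if err != nil {
		return nil, err
	}
	docs := make([]llmcomposer.Document, len(chunks))
	for i, chunk := range chunks {
		docs[i] = llmcomposer.Document{PageContent: chunk}
	}
	return docs, nil
})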

type Embeddings

type Embeddings interface {

	// EmbedDocuments returns the embeddings for the given documents
	EmbedDocuments(context.Context, []string) ([][]float32, error)

	// EmbedQuery returns the embedding for the given query
	EmbedQuery(context.Context, string) ([]float32, error)
}

Embeddings can be used to create a numerical representation of textual data, which is useful for finding similar documents.

type FileStore

type FileStore interface {
	ReadFile(ctx context.Context, path string) (string, error)
	WriteFile(ctx context.Context, path, content string) error
	AppendToFile(ctx context.Context, path, content string) error
}

FileStore is an abstraction of a filesystem, used to read and write files.
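
A sketch of a minimal in-memory implementation, e.g. for tests (memStore is illustrative and not part of the package):

// memStore is a hypothetical in-memory FileStore, handy for tests.
type memStore struct {
	files map[string]string
}

var _ llmcomposer.FileStore = (*memStore)(nil)

func newMemStore() *memStore {
	return &memStore{files: map[string]string{}}
}

func (s *memStore) ReadFile(ctx context.Context, path string) (string, error) {
	content, ok := s.files[path]
	if !ok {
		return "", fmt.Errorf("file not found: %s", path)
	}
	return content, nil
}

func (s *memStore) WriteFile(ctx context.Context, path, content string) error {
	s.files[path] = content
	return nil
}

func (s *memStore) AppendToFile(ctx context.Context, path, content string) error {
	s.files[path] += content
	return nil
}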

type LLM

type LLM interface {
	Call(ctx context.Context, prompt Prompt, values ...Values) (string, error)
}
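
LLM is a single-method interface, so a canned fake is easy to swap in for tests. A sketch (fakeLLM is illustrative and not part of the package):

// fakeLLM is a hypothetical LLM that always returns a canned answer.
type fakeLLM struct {
	answer string
}

var _ llmcomposer.LLM = fakeLLM{}

func (f fakeLLM) Call(ctx context.Context, prompt llmcomposer.Prompt, values ...llmcomposer.Values) (string, error) {
	// A real implementation would send prompt.Format(values...) to a model API;
	// here we only return the fixed answer.
	_ = prompt.Format(values...)
	return f.answer, nil
}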

type Memory

type Memory interface {
	// LoadMemoryVariables loads the memory variables from the memory
	LoadMemoryVariables(context.Context, ...Values) (Values, error)

	// SaveContext saves the context to the memory
	SaveContext(ctx context.Context, input, output Values) error
}

type MessageTemplate

type MessageTemplate struct {
	Prompt Prompt
	Role   string
}

type OutputParser

type OutputParser[T any] interface {
	FormatInstructions() string
	Parse(text string) (T, error)
}
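
A sketch of a parser that turns a model's comma-separated answer into a slice, with FormatInstructions telling the model how to answer (csvParser is illustrative, not shipped with the package):

// csvParser is a hypothetical OutputParser that splits a reply into a string slice.
type csvParser struct{}

var _ llmcomposer.OutputParser[[]string] = csvParser{}

func (csvParser) FormatInstructions() string {
	return "Answer with a comma-separated list and no extra commentary."
}

func (csvParser) Parse(text string) ([]string, error) {
	parts := strings.Split(text, ",")
	for i, p := range parts {
		parts[i] = strings.TrimSpace(p)
	}
	return parts, nil
}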

type Prompt

type Prompt interface {
	Format(...Values) string
}

type PromptFunc

type PromptFunc func(...Values) string

func (PromptFunc) Format

func (f PromptFunc) Format(values ...Values) string
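
When a template string is not enough, a prompt can be built programmatically with PromptFunc. A sketch (the wording and the "topic" key are illustrative):

prompt := llmcomposer.PromptFunc(func(values ...llmcomposer.Values) string {
	v := llmcomposer.Values{}.Merge(values...)
	return fmt.Sprintf("List three facts about %v, one per line.", v["topic"])
})

fmt.Println(prompt.Format(llmcomposer.Values{"topic": "colorful socks"}))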

type ScoredDocument

type ScoredDocument struct {
	Document
	Similarity float32
}

type Splitter

type Splitter = func(string) ([]string, error)
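
Any func(string) ([]string, error) works as a Splitter; document loaders accept one to break text into smaller chunks. A sketch that splits on blank lines (illustrative):

// paragraphSplitter is a hypothetical Splitter that breaks text on blank lines.
var paragraphSplitter llmcomposer.Splitter = func(text string) ([]string, error) {
	return strings.Split(text, "\n\n"), nil
}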

type Tool

type Tool interface {
	// Name returns the name of the tool.
	Name() string
	// Description returns a short description of the tool.
	Description() string
	// Args returns a list of arguments. Each argument must be a string with the format "name:description".
	Args() []string
	// Call is called to execute the tool with the given argument values.
	// The returned string has to be a valid JSON string.
	Call(ctx context.Context, args map[string]string) (string, error)
}
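
A sketch of a Tool implementation that follows the contract above: Args entries use the "name:description" format and Call returns valid JSON (wordCountTool is illustrative, not part of the package):

// wordCountTool is a hypothetical Tool that counts the words in a piece of text.
type wordCountTool struct{}

func (wordCountTool) Name() string        { return "word_count" }
func (wordCountTool) Description() string { return "Counts the number of words in a text." }

func (wordCountTool) Args() []string {
	// Each entry follows the "name:description" format described above.
	return []string{"text:the text whose words should be counted"}
}

func (wordCountTool) Call(ctx context.Context, args map[string]string) (string, error) {
	count := len(strings.Fields(args["text"]))
	// The contract above requires a valid JSON string as the result.
	out, err := json.Marshal(map[string]int{"count": count})
	return string(out), err
}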

type Values

type Values map[string]any

func (Values) Keys

func (iv Values) Keys() []string

func (Values) Merge

func (iv Values) Merge(values ...Values) Values

func (Values) Stop

func (iv Values) Stop() []string

func (Values) String

func (iv Values) String() string

func (Values) WithStop

func (iv Values) WithStop(stop ...string) Values
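
Values is the map of data passed between chain links. A quick sketch of the helper methods (the keys and stop sequence used here are illustrative):

base := llmcomposer.Values{"product": "colorful socks"}
extra := llmcomposer.Values{"tone": "playful"}

merged := base.Merge(extra)
fmt.Println(merged.Keys()) // e.g. [product tone] (map order is not guaranteed)

// WithStop attaches stop sequences that Stop() returns later.
withStop := merged.WithStop("\n\n")
fmt.Println(withStop.Stop())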

type VectorStore

type VectorStore interface {
	// AddDocuments adds the given documents to the store
	AddDocuments(context.Context, ...Document) error
	// SimilaritySearch returns the k most similar documents to the query
	SimilaritySearch(ctx context.Context, query string, k int) ([]Document, error)
	// SimilaritySearchVectorWithScore returns the k most similar documents to the query, along with their similarity score
	SimilaritySearchVectorWithScore(ctx context.Context, query []float32, k int) ([]ScoredDocument, error)
}

VectorStore is a particular type of database optimized for storing documents and their embeddings, and then fetching the most relevant documents for a particular query, i.e. those whose embeddings are most similar to the embedding of the query.
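
A sketch of typical usage against any VectorStore implementation (the store variable and the documents are illustrative; ctx is a context.Context):

err := store.AddDocuments(ctx,
	llmcomposer.Document{PageContent: "Rainbow Socks Co. makes colorful socks.", Metadata: map[string]any{"source": "readme"}},
	llmcomposer.Document{PageContent: "Boring Socks Ltd. makes plain grey socks."},
)
if err != nil {
	panic(err)
}

// Fetch the single most relevant document for the query.
docs, err := store.SimilaritySearch(ctx, "Who makes colorful socks?", 1)
if err != nil {
	panic(err)
}
fmt.Println(docs[0].PageContent)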

Directories

Path      Synopsis
examples
llms
pl        Package pl implements some Data Pipeline helper functions.
