sillybot

README

Silly Bot

A simple Discord and Slack bot written in Go that natively serves an LLM and Stable Diffusion to chat, generate images, and create memes!

⚠️ Warning ⚠️: This is a work in progress. There are no privacy controls. Please file an issue for feature requests or bugs you find.

I gave a talk about it: youtu.be/uGkrjsMl-ok

Usage

Type in chat:

/meme_auto sillybot meme generator awesome

Receive:

(a generated meme image captioned "Meme Lord in Training")

Talk with it and use its commands as described in the documentation.

Features

Hardware requirements

  • LLM: almost any computer less than 4 years old will do; a GPU is not required. If you are unsure, start with Qwen2 0.5B by setting model: "qwen2-0_5b-instruct-q5_k_m" in the llm: section of config.yml (see the sketch after this list). This requires 1GiB of RAM. Move up to larger models from there.
    • I recommend chatting with the bot, using the /metrics command, and iterating to increasingly larger models as long as you keep >=200 tok/s for prompt parsing and >=20 tok/s for generation.
    • When using models with a large context window (>8K) like Llama 3.1 and Mistral Nemo, you may want to explicitly limit the context window to a smaller size like 16384, 32768, or 65536 to reduce memory requirements. Counter-intuitively, reducing the context window too much will also slow down generation.
  • Image generation: an Nvidia GPU with 6GiB of video memory (VRAM) (12GiB is better) or a MacBook Pro. While it works on a CPU, expect a minute or two to generate each image.
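
As a rough illustration, the llm: section could look like the sketch below. The model: value is the one recommended above; the bot: nesting follows the Config struct documented further down, and the context-size key name is a placeholder assumption, so check the generated default config.yml for the real key names.

bot:
  llm:
    # Small model needing about 1GiB of RAM; move to larger models as /metrics allows.
    model: "qwen2-0_5b-instruct-q5_k_m"
    # Hypothetical key name: explicitly cap the context window for large-context models.
    # context_length: 32768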

Pro-tip: You can use two computers: one running the LLM and the other running image generation! Start the llm/imagegen server manually, then use the remote: option in config.yml (see the sketch below).
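
Purely as a sketch of that two-machine setup, assuming the remote: option lives in the llm: and image_gen: sections (the key placement and the host:port value format are assumptions, not documented here):

bot:
  llm:
    # Hypothetical: point the bot at an LLM server started manually on another machine.
    remote: "192.168.1.10:8080"
  image_gen:
    remote: "192.168.1.11:8081"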

Installation

Both bots function essentially the same, but the Application configuration on the server (Discord or Slack) differs.

Acknowledgements

This project benefits greatly from llama.cpp by Georgi Gerganov (previous versions leveraged llamafile by Justine Tunney), from all open source contributors, and from all the companies providing open-weights models.

Author

sillybot was created with ❤️️ and passion by Marc-Antoine Ruel.

Documentation

Overview

Package sillybot implements the common code used by both discord-bot and slack-bot.

Index

Constants

This section is empty.

Variables

var DefaultConfig []byte

Default configuration with well-known models and sane presets.

Functions

func LoadModels

func LoadModels(ctx context.Context, cache string, cfg *Config) (*llm.Session, *imagegen.Session, error)

LoadModels loads the LLM and ImageGen models.

Both take a while to start, so they are loaded in parallel for faster initialization.
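
A minimal sketch of calling it from a main function, assuming the module import path github.com/maruel/sillybot (the import path and the meaning of the cache argument as a download directory are assumptions):

package main

import (
	"context"
	"log"

	"github.com/maruel/sillybot" // assumed module path
)

func main() {
	ctx := context.Background()
	cfg := &sillybot.Config{}
	// Loads config.yml if present, otherwise writes the default config to disk.
	if err := cfg.LoadOrDefault("config.yml"); err != nil {
		log.Fatal(err)
	}
	if err := cfg.Validate(); err != nil {
		log.Fatal(err)
	}
	// LoadModels starts the LLM and image generation backends in parallel.
	l, ig, err := sillybot.LoadModels(ctx, "cache", cfg)
	if err != nil {
		log.Fatal(err)
	}
	_, _ = l, ig // hand the sessions to the Discord or Slack bot from here
}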

Types

type Config

type Config struct {
	Bot struct {
		LLM      llm.Options
		ImageGen imagegen.Options `yaml:"image_gen"`
		Settings Settings
	}
	KnownLLMs []llm.KnownLLM
}

Config defines the configuration format.

func (*Config) LoadOrDefault

func (c *Config) LoadOrDefault(config string) error

LoadOrDefault loads a config or writes the default to disk.

func (*Config) Validate

func (c *Config) Validate() error

Validate checks for obvious errors in the fields.

type Settings

type Settings struct {
	// PromptSystem is the default system prompt to use. It is a Go template as
	// documented at https://pkg.go.dev/text/template. Values provided by the LLM are:
	// - Now: current time in ISO-8601, including the server's time zone.
	// - Model: the model name.
	PromptSystem string `yaml:"prompt_system"`
	// PromptLabels is the prompt used to generate meme labels via a short
	// description.
	PromptLabels string `yaml:"prompt_labels"`
	// PromptImage is the prompt used to generate an image via a short
	// description.
	PromptImage string `yaml:"prompt_image"`
}

Settings is the bot settings.
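
To illustrate how these map to config.yml, a hedged sketch: the key names come from the yaml tags above, the bot:/settings: nesting follows the Config struct, and the prompt texts themselves are made-up examples.

bot:
  settings:
    # Go text/template; Now and Model are the documented template values.
    prompt_system: "You are a helpful bot running on {{.Model}}. The current time is {{.Now}}."
    prompt_labels: "Write a meme's top and bottom labels for this description:"
    prompt_image: "Write an image generation prompt for this description:"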

Directories

Path             Synopsis
cmd
  discord-bot    Silly bot to chat with.
  slack-bot      Silly bot to chat with.
huggingface      Package huggingface is the best library to fetch files from an huggingface repository.
imagegen         Package imagegen runs an image generator.
internal         Package internal contains various random shared code.
llm              Package llm runs a LLM locally via llama.cpp, llamafile, or with a python server.
  tools          Package tools contains structures to generate function calls, tool calling from LLMs.
py               Package py manages the python backends.
