conversation package

v0.0.7
Published: Apr 7, 2024 License: MIT Imports: 8 Imported by: 1

README

Bobatea Conversation UI

Bobatea Conversation UI is a Go library for rendering a conversation tree as a Bubble Tea text UI. It provides a simple way to display conversation messages in a terminal-based interface, intended to be nested within another model.

Installation

To use the Bobatea Conversation UI library in your Go project, run:

go get github.com/go-go-golems/bobatea/pkg/chat/conversation

Message Types

The library handles the following message types for streaming operations:

  • StreamStartMsg: Sent when a streaming operation begins. The UI appends a new message to indicate the assistant has started processing.
  • StreamStatusMsg: Provides status updates during streaming. Can be used to show loading indicators.
  • StreamCompletionMsg: Sent when new data, such as a message completion, is available. The UI updates the message content.
  • StreamDoneMsg: Signals the successful completion of streaming. The UI finalizes the message content.
  • StreamCompletionError: Indicates an error occurred during streaming. The UI can display an error state.

These message types are used to communicate between the backend and the UI during streaming operations. The backend sends these messages to the UI through the Bubble Tea scheduler, and the UI updates the conversation tree accordingly. This allows for real-time updates and a smooth user experience as the assistant generates responses.
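
For illustration, here is a minimal sketch of a backend goroutine driving these messages through a running Bubble Tea program via Program.Send. The chunks channel, the pre-filled StreamMetadata, and the conversationui import alias are assumptions made for this example; how node IDs are allocated depends on the conversation manager in use.

package chatbackend

import (
	tea "github.com/charmbracelet/bubbletea"

	conversationui "github.com/go-go-golems/bobatea/pkg/chat/conversation"
)

// streamResponse forwards chunks from a producer (e.g. an LLM client) to the
// UI. meta carries the metadata for the message being streamed; chunks is a
// hypothetical channel of text deltas.
func streamResponse(p *tea.Program, meta conversationui.StreamMetadata, chunks <-chan string) {
	p.Send(conversationui.StreamStartMsg{StreamMetadata: meta})
	p.Send(conversationui.StreamStatusMsg{StreamMetadata: meta, Text: "generating..."})

	completion := ""
	for delta := range chunks {
		completion += delta
		p.Send(conversationui.StreamCompletionMsg{
			StreamMetadata: meta,
			Delta:          delta,
			Completion:     completion,
		})
	}

	p.Send(conversationui.StreamDoneMsg{StreamMetadata: meta, Completion: completion})

	// On failure, the backend would send an error instead of StreamDoneMsg:
	//   p.Send(conversationui.StreamCompletionError{StreamMetadata: meta, Err: err})
}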

Usage

There is an example of how to use the library in cmd/conversation/main.go.
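
As a rough sketch of such an embedding (not a copy of that example), the conversation model is created from a conversation.Manager, resized on tea.WindowSizeMsg, and has its Update and View delegated to by the parent model. The geppetto import path for the Manager interface is an assumption based on the go-go-golems ecosystem; substitute whatever manager implementation your application uses.

package chatui

import (
	tea "github.com/charmbracelet/bubbletea"

	conversationui "github.com/go-go-golems/bobatea/pkg/chat/conversation"
	// Assumed source of the conversation.Manager interface.
	"github.com/go-go-golems/geppetto/pkg/conversation"
)

// app is a parent Bubble Tea model that nests the conversation UI.
type app struct {
	conv conversationui.Model
}

func newApp(manager conversation.Manager) app {
	return app{conv: conversationui.NewModel(manager)}
}

func (a app) Init() tea.Cmd { return a.conv.Init() }

func (a app) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	if size, ok := msg.(tea.WindowSizeMsg); ok {
		a.conv.SetWidth(size.Width)
	}
	var cmd tea.Cmd
	a.conv, cmd = a.conv.Update(msg)
	return a, cmd
}

func (a app) View() string { return a.conv.View() }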

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type BorderColors

type BorderColors struct {
	Unselected string
	Selected   string
	Focused    string
}

type MessagePosition

type MessagePosition struct {
	Offset int
	Height int
}

type Model

type Model struct {
	// contains filtered or unexported fields
}

func NewModel

func NewModel(manager conversation.Manager) Model

func (Model) Conversation

func (m Model) Conversation() conversation.Conversation

func (Model) Init

func (m Model) Init() tea.Cmd

func (*Model) SelectedIdx

func (m *Model) SelectedIdx() int

func (*Model) SetActive

func (m *Model) SetActive(active bool)

func (*Model) SetSelectedIdx

func (m *Model) SetSelectedIdx(idx int)

func (*Model) SetWidth

func (m *Model) SetWidth(width int)

func (Model) Update

func (m Model) Update(msg tea.Msg) (Model, tea.Cmd)

func (Model) View

func (m Model) View() string

func (Model) ViewAndSelectedPosition

func (m Model) ViewAndSelectedPosition() (string, MessagePosition)
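
ViewAndSelectedPosition is useful when the conversation is rendered inside a scrollable container. The following sketch pairs it with the bubbles viewport; interpreting Offset as the first line of the selected message within the rendered view, and Height as its height in lines, is an assumption based on the field names.

package chatui

import (
	"github.com/charmbracelet/bubbles/viewport"

	conversationui "github.com/go-go-golems/bobatea/pkg/chat/conversation"
)

// syncViewport renders the conversation into a viewport and scrolls so the
// selected message stays visible.
func syncViewport(conv conversationui.Model, vp *viewport.Model) {
	view, pos := conv.ViewAndSelectedPosition()
	vp.SetContent(view)

	switch {
	case pos.Offset < vp.YOffset:
		// Selected message starts above the visible window: scroll up to it.
		vp.SetYOffset(pos.Offset)
	case pos.Offset+pos.Height > vp.YOffset+vp.Height:
		// Selected message ends below the visible window: scroll down.
		vp.SetYOffset(pos.Offset + pos.Height - vp.Height)
	}
}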

type StepMetadata

type StepMetadata struct {
	StepID     uuid.UUID `json:"step_id"`
	Type       string    `json:"type"`
	InputType  string    `json:"input_type"`
	OutputType string    `json:"output_type"`

	Metadata map[string]interface{} `json:"meta"`
}

StepMetadata represents metadata about the step that issues the streaming messages. There is no real definition of what a streaming message is right now; this will need to be cleaned up as the agent framework is built out. NOTE(manuel, 2024-01-17) This is a copy of the StepMetadata in geppetto, and we might want to extract this out into a separate steps package.

func (*StepMetadata) ToMap

func (sm *StepMetadata) ToMap() map[string]interface{}
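
As an illustration of how step metadata travels with the stream messages, here is a sketch of building a StreamMetadata with an attached StepMetadata. The field values are illustrative, not a defined schema, and the assumption that StepID comes from github.com/google/uuid is based only on the field's uuid.UUID type.

package chatbackend

import (
	"github.com/google/uuid"

	conversationui "github.com/go-go-golems/bobatea/pkg/chat/conversation"
)

// exampleStreamMetadata builds the metadata attached to the stream messages.
// All values below are illustrative; node IDs are normally assigned by the
// conversation manager.
func exampleStreamMetadata() conversationui.StreamMetadata {
	step := &conversationui.StepMetadata{
		StepID:     uuid.New(),
		Type:       "chat-completion",
		InputType:  "conversation",
		OutputType: "string",
		Metadata:   map[string]interface{}{"model": "example-model"},
	}

	// ToMap flattens the step metadata into a plain map, e.g. for logging.
	_ = step.ToMap()

	return conversationui.StreamMetadata{
		// ID and ParentID are conversation.NodeID values and are left zero here.
		Step: step,
	}
}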

type StreamCompletionError

type StreamCompletionError struct {
	StreamMetadata
	Err error
}

StreamCompletionError is sent by the backend when an error occurs during the streaming operation.

The UI uses this message to display an error state or message to the user.

type StreamCompletionMsg

type StreamCompletionMsg struct {
	StreamMetadata
	Delta      string
	Completion string
}

StreamCompletionMsg is sent by the backend when new data, such as a message completion, is available.

The UI uses this message to update the text content of the respective message in the conversation. The conversation manager updates the message content and the last update timestamp.

type StreamDoneMsg

type StreamDoneMsg struct {
	StreamMetadata
	Completion string
}

StreamDoneMsg is sent by the backend to signal the successful completion of the streaming operation.

The UI uses this message to finalize the content of the message in the conversation. The conversation manager updates the message content and the last update timestamp to reflect the final text.

type StreamMetadata

type StreamMetadata struct {
	ID       conversation.NodeID    `json:"id" yaml:"id"`
	ParentID conversation.NodeID    `json:"parent_id" yaml:"parent_id"`
	Metadata map[string]interface{} `json:"metadata" yaml:"metadata"`
	Step     *StepMetadata          `json:"step_metadata,omitempty"`
}

type StreamStartMsg

type StreamStartMsg struct {
	StreamMetadata
}

StreamStartMsg is sent by the backend when a streaming operation begins. The UI uses this message to append a new message to the conversation, indicating that the assistant has started processing. The conversation manager is responsible for adding this message to the conversation tree.

type StreamStatusMsg

type StreamStatusMsg struct {
	StreamMetadata
	Text string
}

StreamStatusMsg is sent by the backend to provide status updates during a streaming operation. It includes the current text of the stream along with the stream metadata.

The UI typically does not need to update the conversation view in response to this message, but it could be used to show a loading indicator or similar temporary status.

type Style

type Style struct {
	UnselectedMessage lipgloss.Style
	SelectedMessage   lipgloss.Style
	FocusedMessage    lipgloss.Style
}

func DefaultStyles

func DefaultStyles() *Style
