gptscript

package module
v0.9.5
Published: Sep 27, 2024 License: Apache-2.0 Imports: 20 Imported by: 12

README

go-gptscript

This module provides a set of functions for interacting with GPTScript. It allows for executing scripts, listing available tools and models, and more.

Installation

To use this module, you need to have Go installed on your system. Then, you can install the module via:

go get github.com/gptscript-ai/go-gptscript

Usage

To use the module, you need to first set the OPENAI_API_KEY environment variable to your OpenAI API key.

Additionally, you need the gptscript binary. You can install it on your system using the installation instructions. The binary can be on the PATH, or the GPTSCRIPT_BIN environment variable can be used to specify its location.
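A minimal environment setup might look like the following (the binary path is illustrative; GPTSCRIPT_BIN is only needed when gptscript is not on the PATH):

```shell
# Illustrative setup; adjust the key and path for your system.
export OPENAI_API_KEY="sk-..."
export GPTSCRIPT_BIN="/usr/local/bin/gptscript"
```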

GPTScript

The GPTScript instance allows the caller to run gptscript files, tools, and other operations (see below). Note that a single GPTScript instance is intended to serve the life of your application; call Close() on the instance when you are done.

Global Options

When creating a GPTScript instance, you can pass the following global options. These options are also available as run Options. Anything specified as a run option will take precedence over the global option.

  • CacheDir: The directory to use for caching. Default (""), which uses the default cache directory.
  • APIKey: Specify an OpenAI API key for authenticating requests
  • BaseURL: A base URL for an OpenAI compatible API (the default is https://api.openai.com/v1)
  • DefaultModel: The default model to use for chat completion requests
  • DefaultModelProvider: The default model provider to use for chat completion requests
  • Env: Supply the environment variables. Supplying anything here means that nothing from the environment is used. The default is os.Environ(). Supplying Env at the run/evaluate level will be treated as "additional."
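Because supplying Env at the global level replaces the environment rather than adding to it, a caller who wants extras on top of the process environment must build the merged slice themselves. A sketch (the variable name is illustrative):

```go
package main

import (
	"fmt"
	"os"
)

// withExtras copies the process environment and appends extra
// variables. This matters because supplying Env in GlobalOptions
// replaces the environment entirely rather than adding to it.
func withExtras(extras ...string) []string {
	return append(os.Environ(), extras...)
}

func main() {
	env := withExtras("MY_TOOL_SETTING=enabled") // illustrative variable name
	fmt.Println(env[len(env)-1])
}
```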

Run Options

These options can be passed to the various exec functions. None of them is required, and the defaults will reduce the number of calls made to the Model API. As noted above, the Global Options may also be specified here, in which case they take precedence over their global values.

  • disableCache: Disable caching. Default (false).
  • subTool: Use tool of this name, not the first tool
  • input: Input arguments for the tool run
  • workspace: Directory to use for the workspace, if specified it will not be deleted on exit
  • includeEvents: Whether to stream events. Default (false). Note that if this is true, you must stream the events. See below for details.
  • chatState: The chat state to continue, or null to start a new chat and return the state
  • confirm: Prompt before running potentially dangerous commands
  • prompt: Allow prompting of the user

Functions

listModels

Lists all the available models and returns them as a list.

Usage:

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func listModels(ctx context.Context) ([]string, error) {
	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return nil, err
	}
	defer g.Close()

	return g.ListModels(ctx)
}
Parse

Parses a GPTScript file into an array of Nodes.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func parse(ctx context.Context, fileName string) ([]gptscript.Node, error) {
	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return nil, err
	}
	defer g.Close()

	return g.Parse(ctx, fileName)
}
ParseContent

Parse contents that represent a GPTScript file into a data structure.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func parseContent(ctx context.Context, contents string) ([]gptscript.Node, error) {
	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return nil, err
	}
	defer g.Close()

	return g.ParseContent(ctx, contents)
}
Fmt

Fmt converts a tool data structure back into GPTScript file content.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func fmtNodes(ctx context.Context, nodes []gptscript.Node) (string, error) {
	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return "", err
	}
	defer g.Close()

	return g.Fmt(ctx, nodes)
}
Evaluate

Executes a tool with optional arguments.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func runTool(ctx context.Context) (string, error) {
	t := gptscript.ToolDef{
		Instructions: "who was the president of the united states in 1928?",
	}

	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return "", err
	}
	defer g.Close()

	run, err := g.Evaluate(ctx, gptscript.Options{}, t)
	if err != nil {
		return "", err
	}

	return run.Text()
}
Run

Executes a GPTScript file with optional input and arguments. The script path is relative to the caller's source directory.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func runFile(ctx context.Context) (string, error) {
	opts := gptscript.Options{
		DisableCache: true,
		Input:        "--input hello",
	}

	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return "", err
	}
	defer g.Close()

	run, err := g.Run(ctx, "./hello.gpt", opts)
	if err != nil {
		return "", err
	}

	return run.Text()
}
Streaming events

In order to stream events, you must set the IncludeEvents option to true. If you don't set this and try to stream events, the call will succeed, but you will not receive any events. More importantly, if you set IncludeEvents to true, you must consume the events for the script to complete.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func streamExecTool(ctx context.Context) error {
	opts := gptscript.Options{
		DisableCache:  true,
		IncludeEvents: true,
		Input:         "--input world",
	}

	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return err
	}
	defer g.Close()

	run, err := g.Run(ctx, "./hello.gpt", opts)
	if err != nil {
		return err
	}

	for event := range run.Events() {
		// Process event...
	}

	_, err = run.Text()
	return err
}
Confirm

Using the Confirm: true option allows a user to inspect potentially dangerous commands before they are run, and the caller can allow or deny their execution. To do this, the caller should watch for the CallConfirm event, which also means IncludeEvents must be true.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func runFileWithConfirm(ctx context.Context) (string, error) {
	opts := gptscript.Options{
		DisableCache:  true,
		Input:         "--input hello",
		Confirm:       true,
		IncludeEvents: true,
	}

	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return "", err
	}
	defer g.Close()

	run, err := g.Run(ctx, "./hello.gpt", opts)
	if err != nil {
		return "", err
	}

	for event := range run.Events() {
		if event.Call != nil && event.Call.Type == gptscript.EventTypeCallConfirm {
			// event.Tool has the information on the command being run.
			// and event.Input will have the input to the command being run.

			err = g.Confirm(ctx, gptscript.AuthResponse{
				ID: event.ID,
				Accept: true, // Or false if not allowed.
				Message: "", // A message explaining why the command is not allowed (ignored if allowed).
			})
			if err != nil {
				// Handle error
			}
		}

		// Process event...
	}

	return run.Text()
}
Prompt

Using the Prompt: true option allows a script to prompt the user for input. To support this, the caller should watch for the Prompt event, which also means IncludeEvents must be true. Note that if a Prompt event occurs when prompting has not been explicitly allowed, the run will error.

package main

import (
	"context"

	"github.com/gptscript-ai/go-gptscript"
)

func runFileWithPrompt(ctx context.Context) (string, error) {
	opts := gptscript.Options{
		DisableCache:  true,
		Input:         "--input hello",
		Prompt:        true,
		IncludeEvents: true,
	}

	g, err := gptscript.NewGPTScript(gptscript.GlobalOptions{})
	if err != nil {
		return "", err
	}
	defer g.Close()

	run, err := g.Run(ctx, "./hello.gpt", opts)
	if err != nil {
		return "", err
	}

	for event := range run.Events() {
		if event.Prompt != nil {
			// event.Prompt has the information to prompt the user.

			err = g.PromptResponse(ctx, gptscript.PromptResponse{
				ID: event.Prompt.ID,
				// Responses is a map[string]string of Fields to values
				Responses: map[string]string{
					event.Prompt.Fields[0]: "Some Value",
				},
			})
			if err != nil {
				// Handle error
			}
		}

		// Process event...
	}

	return run.Text()
}

Types

Tool Parameters
  • name (string, default ""): The name of the tool. Optional only on the first tool if there are multiple tools defined.
  • description (string, default ""): A brief description of what the tool does. This is important for explaining to the LLM when it should be used.
  • tools (array, default []): An array of tools that the current tool might depend on or use.
  • maxTokens (number, default undefined): The maximum number of tokens to be used. Prefer undefined for uninitialized or optional values.
  • model (string, default ""): The model that the tool uses, if applicable.
  • cache (boolean, default true): Whether caching is enabled for the tool.
  • temperature (number, default undefined): The temperature setting for the model, affecting randomness. undefined for default behavior.
  • args (object, default {}): Additional arguments specific to the tool, described by key-value pairs.
  • internalPrompt (boolean, default false): An internal prompt used by the tool, if any.
  • instructions (string, default ""): Instructions on how to use the tool.
  • jsonResponse (boolean, default false): Whether the tool returns a JSON response instead of plain text. You must include the word "json" in the body of the prompt.
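As an illustrative sketch (not taken from this README), several of these parameters map to directives at the top of a GPTScript file, followed by the tool's instructions:

```
Name: summarize
Description: Summarize the provided text
Args: input: The text to summarize
Temperature: 0.2
JSON Response: true

Produce a JSON object summarizing ${input}. Respond only with json.
```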

License

Copyright (c) 2024, Acorn Labs, Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Documentation

Index

Constants

const (
	ProviderToolCategory   ToolCategory = "provider"
	CredentialToolCategory ToolCategory = "credential"
	ContextToolCategory    ToolCategory = "context"
	InputToolCategory      ToolCategory = "input"
	OutputToolCategory     ToolCategory = "output"
	NoCategory             ToolCategory = ""

	EventTypeRunStart     EventType = "runStart"
	EventTypeCallStart    EventType = "callStart"
	EventTypeCallContinue EventType = "callContinue"
	EventTypeCallSubCalls EventType = "callSubCalls"
	EventTypeCallProgress EventType = "callProgress"
	EventTypeChat         EventType = "callChat"
	EventTypeCallConfirm  EventType = "callConfirm"
	EventTypeCallFinish   EventType = "callFinish"
	EventTypeRunFinish    EventType = "runFinish"

	EventTypePrompt EventType = "prompt"
)
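As a sketch of how these constants are typically used when consuming events, an event loop might check for finish events. The types below are local mirrors for illustration; the real ones live in this package:

```go
package main

import "fmt"

// EventType mirrors the package's string-based event type
// (a local copy for illustration).
type EventType string

const (
	EventTypeCallStart  EventType = "callStart"
	EventTypeCallFinish EventType = "callFinish"
	EventTypeRunFinish  EventType = "runFinish"
)

// isFinish reports whether an event marks the end of a call or run,
// the kind of check an event-consuming loop might make.
func isFinish(t EventType) bool {
	return t == EventTypeCallFinish || t == EventTypeRunFinish
}

func main() {
	fmt.Println(isFinish(EventTypeCallStart), isFinish(EventTypeRunFinish))
}
```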

Variables

This section is empty.

Functions

func GetEnv added in v0.9.4

func GetEnv(key, def string) string

func ObjectSchema

func ObjectSchema(kv ...string) *openapi3.Schema

Types

type AuthResponse

type AuthResponse struct {
	ID      string `json:"id"`
	Accept  bool   `json:"accept"`
	Message string `json:"message"`
}

type Call

type Call struct {
	ToolID string `json:"toolID,omitempty"`
	Input  string `json:"input,omitempty"`
}

type CallContext

type CallContext struct {
	ID           string          `json:"id"`
	Tool         Tool            `json:"tool"`
	AgentGroup   []ToolReference `json:"agentGroup,omitempty"`
	CurrentAgent ToolReference   `json:"currentAgent,omitempty"`
	DisplayText  string          `json:"displayText"`
	InputContext []InputContext  `json:"inputContext"`
	ToolCategory ToolCategory    `json:"toolCategory,omitempty"`
	ToolName     string          `json:"toolName,omitempty"`
	ParentID     string          `json:"parentID,omitempty"`
}

type CallFrame

type CallFrame struct {
	CallContext `json:",inline"`

	Type               EventType `json:"type"`
	Start              time.Time `json:"start"`
	End                time.Time `json:"end"`
	Input              string    `json:"input"`
	Output             []Output  `json:"output"`
	Usage              Usage     `json:"usage"`
	ChatResponseCached bool      `json:"chatResponseCached"`
	ToolResults        int       `json:"toolResults"`
	LLMRequest         any       `json:"llmRequest"`
	LLMResponse        any       `json:"llmResponse"`
}

type Credential added in v0.9.5

type Credential struct {
	Context      string            `json:"context"`
	ToolName     string            `json:"toolName"`
	Type         CredentialType    `json:"type"`
	Env          map[string]string `json:"env"`
	Ephemeral    bool              `json:"ephemeral,omitempty"`
	ExpiresAt    *time.Time        `json:"expiresAt"`
	RefreshToken string            `json:"refreshToken"`
}

type CredentialRequest added in v0.9.5

type CredentialRequest struct {
	Content     string   `json:"content"`
	AllContexts bool     `json:"allContexts"`
	Context     []string `json:"context"`
	Name        string   `json:"name"`
}

type CredentialType added in v0.9.5

type CredentialType string
const (
	CredentialTypeTool          CredentialType = "tool"
	CredentialTypeModelProvider CredentialType = "modelProvider"
)

type Document

type Document struct {
	Nodes []Node `json:"nodes,omitempty"`
}

type ErrNotFound added in v0.9.5

type ErrNotFound struct {
	Message string
}

func (ErrNotFound) Error added in v0.9.5

func (e ErrNotFound) Error() string

type EventType

type EventType string

type Frame

type Frame struct {
	Run    *RunFrame    `json:"run,omitempty"`
	Call   *CallFrame   `json:"call,omitempty"`
	Prompt *PromptFrame `json:"prompt,omitempty"`
}

type GPTScript

type GPTScript struct {
	// contains filtered or unexported fields
}

func NewGPTScript

func NewGPTScript(opts ...GlobalOptions) (*GPTScript, error)

func (*GPTScript) Close

func (g *GPTScript) Close()

func (*GPTScript) Confirm

func (g *GPTScript) Confirm(ctx context.Context, resp AuthResponse) error

func (*GPTScript) CreateCredential added in v0.9.5

func (g *GPTScript) CreateCredential(ctx context.Context, cred Credential) error

func (*GPTScript) DeleteCredential added in v0.9.5

func (g *GPTScript) DeleteCredential(ctx context.Context, credCtx, name string) error

func (*GPTScript) Evaluate

func (g *GPTScript) Evaluate(ctx context.Context, opts Options, tools ...ToolDef) (*Run, error)

func (*GPTScript) Fmt

func (g *GPTScript) Fmt(ctx context.Context, nodes []Node) (string, error)

Fmt will format the given nodes into a string.

func (*GPTScript) ListCredentials added in v0.9.5

func (g *GPTScript) ListCredentials(ctx context.Context, opts ListCredentialsOptions) ([]Credential, error)

func (*GPTScript) ListModels

func (g *GPTScript) ListModels(ctx context.Context, opts ...ListModelsOptions) ([]string, error)

ListModels will list all the available models.

func (*GPTScript) LoadContent added in v0.9.5

func (g *GPTScript) LoadContent(ctx context.Context, content string, opts ...LoadOptions) (*Program, error)

LoadContent will load the given content into a Program.

func (*GPTScript) LoadFile added in v0.9.5

func (g *GPTScript) LoadFile(ctx context.Context, fileName string, opts ...LoadOptions) (*Program, error)

LoadFile will load the given file into a Program.

func (*GPTScript) LoadTools added in v0.9.5

func (g *GPTScript) LoadTools(ctx context.Context, toolDefs []ToolDef, opts ...LoadOptions) (*Program, error)

LoadTools will load the given tools into a Program.

func (*GPTScript) Parse

func (g *GPTScript) Parse(ctx context.Context, fileName string, opts ...ParseOptions) ([]Node, error)

Parse will parse the given file into an array of Nodes.

func (*GPTScript) ParseContent added in v0.9.5

func (g *GPTScript) ParseContent(ctx context.Context, toolDef string) ([]Node, error)

ParseContent will parse the given string into a tool.

func (*GPTScript) PromptResponse

func (g *GPTScript) PromptResponse(ctx context.Context, resp PromptResponse) error

func (*GPTScript) RevealCredential added in v0.9.5

func (g *GPTScript) RevealCredential(ctx context.Context, credCtxs []string, name string) (Credential, error)

func (*GPTScript) Run

func (g *GPTScript) Run(ctx context.Context, toolPath string, opts Options) (*Run, error)

func (*GPTScript) URL added in v0.9.5

func (g *GPTScript) URL() string

func (*GPTScript) Version

func (g *GPTScript) Version(ctx context.Context) (string, error)

Version will return the output of `gptscript --version`

type GlobalOptions

type GlobalOptions struct {
	URL                  string   `json:"url"`
	Token                string   `json:"token"`
	OpenAIAPIKey         string   `json:"APIKey"`
	OpenAIBaseURL        string   `json:"BaseURL"`
	DefaultModel         string   `json:"DefaultModel"`
	DefaultModelProvider string   `json:"DefaultModelProvider"`
	CacheDir             string   `json:"CacheDir"`
	Env                  []string `json:"env"`
}

GlobalOptions allows specification of settings that are used for every call made. These options can be overridden by the corresponding Options.
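The override rule can be sketched with a small helper (illustrative, not the library's actual merge code): a value set on the per-run Options wins, otherwise the GlobalOptions value applies. Model names here are examples only.

```go
package main

import "fmt"

// override sketches the precedence rule: a non-empty run-level value
// takes precedence over the global default.
func override(runVal, globalVal string) string {
	if runVal != "" {
		return runVal
	}
	return globalVal
}

func main() {
	fmt.Println(override("", "gpt-4o"))        // global default applies
	fmt.Println(override("mistral", "gpt-4o")) // run-level value wins
}
```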

type InputContext

type InputContext struct {
	ToolID  string `json:"toolID,omitempty"`
	Content string `json:"content,omitempty"`
}

type ListCredentialsOptions added in v0.9.5

type ListCredentialsOptions struct {
	CredentialContexts []string
	AllContexts        bool
}

type ListModelsOptions added in v0.9.5

type ListModelsOptions struct {
	Providers           []string
	CredentialOverrides []string
}

type LoadOptions added in v0.9.5

type LoadOptions struct {
	DisableCache bool
	SubTool      string
}

type Node

type Node struct {
	TextNode *TextNode `json:"textNode,omitempty"`
	ToolNode *ToolNode `json:"toolNode,omitempty"`
}

func ToolDefsToNodes added in v0.9.5

func ToolDefsToNodes(tools []ToolDef) []Node

type Options

type Options struct {
	GlobalOptions `json:",inline"`

	DisableCache        bool     `json:"disableCache"`
	Confirm             bool     `json:"confirm"`
	Input               string   `json:"input"`
	SubTool             string   `json:"subTool"`
	Workspace           string   `json:"workspace"`
	ChatState           string   `json:"chatState"`
	IncludeEvents       bool     `json:"includeEvents"`
	Prompt              bool     `json:"prompt"`
	CredentialOverrides []string `json:"credentialOverrides"`
	CredentialContexts  []string `json:"credentialContexts"`
	Location            string   `json:"location"`
	ForceSequential     bool     `json:"forceSequential"`
}

Options represents options for the gptscript tool or file.

type Output

type Output struct {
	Content  string          `json:"content"`
	SubCalls map[string]Call `json:"subCalls"`
}

type ParseOptions added in v0.9.5

type ParseOptions struct {
	DisableCache bool
}

type Program

type Program struct {
	Name        string  `json:"name,omitempty"`
	EntryToolID string  `json:"entryToolId,omitempty"`
	ToolSet     ToolSet `json:"toolSet,omitempty"`
}

type PromptFrame

type PromptFrame struct {
	ID        string            `json:"id,omitempty"`
	Type      EventType         `json:"type,omitempty"`
	Time      time.Time         `json:"time,omitempty"`
	Message   string            `json:"message,omitempty"`
	Fields    []string          `json:"fields,omitempty"`
	Sensitive bool              `json:"sensitive,omitempty"`
	Metadata  map[string]string `json:"metadata,omitempty"`
}

func (*PromptFrame) String

func (p *PromptFrame) String() string

type PromptResponse

type PromptResponse struct {
	ID        string            `json:"id,omitempty"`
	Responses map[string]string `json:"response,omitempty"`
}

type Repo

type Repo struct {
	VCS      string
	Root     string
	Path     string
	Name     string
	Revision string
}

type Run

type Run struct {
	// contains filtered or unexported fields
}

func (*Run) Bytes

func (r *Run) Bytes() ([]byte, error)

Bytes returns the output of the gptscript in bytes. It blocks until the output is ready.

func (*Run) Calls

func (r *Run) Calls() map[string]CallFrame

Calls will return a flattened array of the calls for this run.

func (*Run) ChatState

func (r *Run) ChatState() string

ChatState returns the current chat state of the Run.

func (*Run) Close

func (r *Run) Close() error

Close will stop the gptscript run, if it is running.

func (*Run) Err

func (r *Run) Err() error

Err returns the error that caused the gptscript to fail, if any.

func (*Run) ErrorOutput

func (r *Run) ErrorOutput() string

ErrorOutput returns the stderr output of the gptscript. Should only be called after Bytes or Text has returned an error.

func (*Run) Events

func (r *Run) Events() <-chan Frame

Events returns a channel that streams the gptscript events as they occur as Frames.

func (*Run) NextChat

func (r *Run) NextChat(ctx context.Context, input string) (*Run, error)

NextChat will pass input and create the next run in a chat. The new Run will be returned.

func (*Run) ParentCallFrame

func (r *Run) ParentCallFrame() (CallFrame, bool)

ParentCallFrame returns the CallFrame for the top-level or "parent" call. The boolean indicates whether there is a parent CallFrame.

func (*Run) Program

func (r *Run) Program() *Program

Program returns the gptscript program for the run.

func (*Run) RawOutput

func (r *Run) RawOutput() (map[string]any, error)

RawOutput returns the raw output of the gptscript. Most users should use Text or Bytes instead.

func (*Run) RespondingTool

func (r *Run) RespondingTool() Tool

RespondingTool returns the name of the tool that produced the output.

func (*Run) State

func (r *Run) State() RunState

State returns the current state of the gptscript.

func (*Run) Text

func (r *Run) Text() (string, error)

Text returns the text output of the gptscript. It blocks until the output is ready.

type RunFrame

type RunFrame struct {
	ID        string    `json:"id"`
	Program   Program   `json:"program"`
	Input     string    `json:"input"`
	Output    string    `json:"output"`
	Error     string    `json:"error"`
	Start     time.Time `json:"start"`
	End       time.Time `json:"end"`
	State     RunState  `json:"state"`
	ChatState any       `json:"chatState"`
	Type      EventType `json:"type"`
}

type RunState

type RunState string
const (
	Creating RunState = "creating"
	Running  RunState = "running"
	Continue RunState = "continue"
	Finished RunState = "finished"
	Error    RunState = "error"
)

func (RunState) IsTerminal

func (rs RunState) IsTerminal() bool
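A plausible reading of IsTerminal is that a run is done once it has finished or errored; whether Continue counts as terminal is a library detail, and the sketch below (using local mirror types, not the package's own code) treats it as non-terminal:

```go
package main

import "fmt"

// RunState mirrors the package's run states (a local copy for
// illustration; the real type lives in this package).
type RunState string

const (
	Creating RunState = "creating"
	Running  RunState = "running"
	Continue RunState = "continue"
	Finished RunState = "finished"
	Error    RunState = "error"
)

// isTerminal sketches one plausible behavior of IsTerminal:
// the run is over once it has finished or errored.
func isTerminal(rs RunState) bool {
	return rs == Finished || rs == Error
}

func main() {
	fmt.Println(isTerminal(Finished), isTerminal(Running))
}
```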

type TextNode

type TextNode struct {
	Fmt  string `json:"fmt,omitempty"`
	Text string `json:"text,omitempty"`
}

type Tool

type Tool struct {
	ToolDef     `json:",inline"`
	ID          string                     `json:"id,omitempty"`
	Arguments   *openapi3.Schema           `json:"arguments,omitempty"`
	ToolMapping map[string][]ToolReference `json:"toolMapping,omitempty"`
	LocalTools  map[string]string          `json:"localTools,omitempty"`
	Source      ToolSource                 `json:"source,omitempty"`
	WorkingDir  string                     `json:"workingDir,omitempty"`
}

type ToolCategory

type ToolCategory string

type ToolDef

type ToolDef struct {
	Name                string            `json:"name,omitempty"`
	Description         string            `json:"description,omitempty"`
	MaxTokens           int               `json:"maxTokens,omitempty"`
	ModelName           string            `json:"modelName,omitempty"`
	ModelProvider       bool              `json:"modelProvider,omitempty"`
	JSONResponse        bool              `json:"jsonResponse,omitempty"`
	Chat                bool              `json:"chat,omitempty"`
	Temperature         *float32          `json:"temperature,omitempty"`
	Cache               *bool             `json:"cache,omitempty"`
	InternalPrompt      *bool             `json:"internalPrompt"`
	Arguments           *openapi3.Schema  `json:"arguments,omitempty"`
	Tools               []string          `json:"tools,omitempty"`
	GlobalTools         []string          `json:"globalTools,omitempty"`
	GlobalModelName     string            `json:"globalModelName,omitempty"`
	Context             []string          `json:"context,omitempty"`
	ExportContext       []string          `json:"exportContext,omitempty"`
	Export              []string          `json:"export,omitempty"`
	Agents              []string          `json:"agents,omitempty"`
	Credentials         []string          `json:"credentials,omitempty"`
	ExportCredentials   []string          `json:"exportCredentials,omitempty"`
	InputFilters        []string          `json:"inputFilters,omitempty"`
	ExportInputFilters  []string          `json:"exportInputFilters,omitempty"`
	OutputFilters       []string          `json:"outputFilters,omitempty"`
	ExportOutputFilters []string          `json:"exportOutputFilters,omitempty"`
	Instructions        string            `json:"instructions,omitempty"`
	Type                string            `json:"type,omitempty"`
	MetaData            map[string]string `json:"metadata,omitempty"`
}

ToolDef struct represents a tool with various configurations.
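Because Temperature and Cache are pointer fields, "unset" can be distinguished from a zero value. A small generic helper makes such fields easy to populate; the helper is our own, not part of the package:

```go
package main

import "fmt"

// ptr returns a pointer to v, useful for optional fields such as
// ToolDef.Temperature (*float32) and ToolDef.Cache (*bool).
func ptr[T any](v T) *T { return &v }

func main() {
	temperature := ptr(float32(0.2))
	cache := ptr(false)
	fmt.Println(*temperature, *cache)
}
```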

type ToolNode

type ToolNode struct {
	Fmt  string `json:"fmt,omitempty"`
	Tool Tool   `json:"tool,omitempty"`
}

type ToolReference

type ToolReference struct {
	Named     string `json:"named,omitempty"`
	Reference string `json:"reference,omitempty"`
	Arg       string `json:"arg,omitempty"`
	ToolID    string `json:"toolID,omitempty"`
}

type ToolSet

type ToolSet map[string]Tool

type ToolSource

type ToolSource struct {
	Location string `json:"location,omitempty"`
	LineNo   int    `json:"lineNo,omitempty"`
	Repo     *Repo  `json:"repo,omitempty"`
}

type Usage

type Usage struct {
	PromptTokens     int `json:"promptTokens,omitempty"`
	CompletionTokens int `json:"completionTokens,omitempty"`
	TotalTokens      int `json:"totalTokens,omitempty"`
}
