dotprompt

package
v0.0.0-...-3081fbb
Published: Jun 14, 2024 License: Apache-2.0 Imports: 22 Imported by: 0

Documentation

Overview

Package dotprompt parses and renders dotprompt files.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func SetDirectory

func SetDirectory(directory string)

SetDirectory sets the directory where dotprompt files are read from.
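
For example, a minimal sketch (the directory name here is illustrative, not a path defined by this package):

// Read *.prompt files from ./prompts for subsequent Open and
// OpenVariant calls; the path is illustrative.
dotprompt.SetDirectory("prompts")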

Types

type Config

type Config struct {
	// The prompt variant.
	Variant string
	// The name of the model for which the prompt is input.
	// If this is non-empty, ModelAction should be nil.
	Model string

	// The ModelAction to use.
	// If this is non-nil, Model should be the empty string.
	ModelAction *ai.ModelAction

	// TODO(iant): document
	Tools []*ai.ToolDefinition

	// Number of candidates to generate when passing the prompt
	// to a model. If 0, uses 1.
	Candidates int

	// Details for the model.
	GenerationConfig *ai.GenerationCommonConfig

	InputSchema      *jsonschema.Schema // schema for input variables
	VariableDefaults map[string]any     // default input variable values

	// Desired output format.
	OutputFormat ai.OutputFormat

	// Desired output schema, for JSON output.
	OutputSchema map[string]any // TODO: use *jsonschema.Schema

	// Arbitrary metadata.
	Metadata map[string]any
}

Config is optional configuration for a Prompt.
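
A hedged sketch of populating some of the documented fields (the model identifier and default values are illustrative, not values defined by this package):

cfg := dotprompt.Config{
	// Name of the model that will receive the rendered prompt.
	Model: "example/some-model",
	// Generate a single candidate unless the request overrides it.
	Candidates: 1,
	// Values used when the caller does not supply these variables.
	VariableDefaults: map[string]any{
		"name": "World",
	},
}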

type Prompt

type Prompt struct {
	// The name of the prompt. Optional unless the prompt is
	// registered as an action.
	Name string

	Config

	// The template for the prompt.
	Template *raymond.Template
	// contains filtered or unexported fields
}

Prompt is a parsed dotprompt file.

A dotprompt file consists of YAML frontmatter within --- lines, followed by a template written in the Handlebars language.

The YAML frontmatter will normally define a JSON schema describing the expected input and output variables. The input variables will appear in the template. The JSON schemas may be defined in a compact picoschema format.

The templates are evaluated with a couple of helpers.

  • {{role r}} changes to a new role for the following text
  • {{media url=URL}} adds a URL with an optional contentType
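
As a hedged sketch of what such a file might look like (the frontmatter keys, picoschema fields, model name, and helper arguments are illustrative assumptions), the same source can also be passed directly to Parse, documented below:

promptSource := `---
model: example/some-model
input:
  schema:
    name: string
    photoUrl: string
---
{{role "user"}}
Hello, {{name}}!
{{media url=photoUrl}}
`

p, err := dotprompt.Parse("greeting", "", []byte(promptSource))
if err != nil {
	// handle the parse error
}
_ = p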

func Define

func Define(name, templateText string, cfg Config) (*Prompt, error)

Define creates and registers a new Prompt. This can be called from code that doesn't have a prompt file.
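
For instance, a hedged sketch (the prompt name, template text, and model identifier are illustrative):

greeting, err := dotprompt.Define("greeting", "Hello, {{name}}!", dotprompt.Config{
	Model: "example/some-model",
})
if err != nil {
	// handle the error
}
_ = greeting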

func New

func New(name, templateText string, cfg Config) (*Prompt, error)

New creates a new Prompt without registering it. This may be used for testing or for direct calls not using the genkit action and flow mechanisms.

func Open

func Open(name string) (*Prompt, error)

Open opens and parses a dotprompt file. The name is a base file name, without the ".prompt" extension.
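
A hedged sketch, assuming a greeting.prompt file exists in the directory configured with SetDirectory:

greeting, err := dotprompt.Open("greeting") // reads greeting.prompt
if err != nil {
	// handle the error
}
_ = greeting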

func OpenVariant

func OpenVariant(name, variant string) (*Prompt, error)

OpenVariant opens and parses a dotprompt file with a variant. If the variant does not exist, the non-variant version is tried.
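
A hedged sketch (the prompt and variant names are illustrative):

// Tries the "formal" variant first, then falls back to the
// non-variant greeting prompt if that variant is not found.
greeting, err := dotprompt.OpenVariant("greeting", "formal")
if err != nil {
	// handle the error
}
_ = greeting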

func Parse

func Parse(name, variant string, data []byte) (*Prompt, error)

Parse parses the contents of a dotprompt file.

func (*Prompt) Generate

Generate executes a prompt. It does variable substitution and passes the rendered template to the AI model specified by the prompt.

This implements the ai.Prompt interface.

func (*Prompt) Register

func (p *Prompt) Register() error

Register registers an action to render a prompt.

func (*Prompt) RenderMessages

func (p *Prompt) RenderMessages(variables map[string]any) ([]*ai.Message, error)

RenderMessages executes the prompt's template and converts it into messages. This just runs the template; it does not call a model.
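
For example, given a *Prompt p obtained from Define or Open (the variable values are illustrative):

msgs, err := p.RenderMessages(map[string]any{"name": "World"})
if err != nil {
	// handle the error
}
fmt.Printf("rendered %d message(s)\n", len(msgs))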

func (*Prompt) RenderText

func (p *Prompt) RenderText(variables map[string]any) (string, error)

RenderText executes the prompt's template and returns the result as a string. The rendered prompt may contain only a single text message. This just runs the template; it does not call a model.
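
For example, again given a *Prompt p and an illustrative variable map:

text, err := p.RenderText(map[string]any{"name": "World"})
if err != nil {
	// handle the error
}
fmt.Println(text) // e.g. "Hello, World!" for a template like "Hello, {{name}}!"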

type PromptRequest

type PromptRequest struct {
	// Input fields for the prompt. If not nil this should be a struct
	// or pointer to a struct that matches the prompt's input schema.
	Variables any `json:"variables,omitempty"`
	// Number of candidates to return; if 0, will be taken
	// from the prompt config; if still 0, will use 1.
	Candidates int `json:"candidates,omitempty"`
	// Model configuration. If nil will be taken from the prompt config.
	Config *ai.GenerationCommonConfig `json:"config,omitempty"`
	// Context to pass to model, if any.
	Context []any `json:"context,omitempty"`
	// The model to use. This overrides any model specified by the prompt.
	Model string `json:"model,omitempty"`
}

PromptRequest is a request to execute a dotprompt template and pass the result to a [ModelAction].
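
A hedged sketch of building a request (greetingInput is a hypothetical type matching the prompt's input schema; the values are illustrative):

// greetingInput mirrors the prompt's declared input variables.
type greetingInput struct {
	Name string `json:"name"`
}

req := &dotprompt.PromptRequest{
	Variables:  &greetingInput{Name: "World"},
	Candidates: 1,
}
_ = req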
