anthropic

package module
v1.6.0

This package is not in the latest version of its module.
Published: May 9, 2023 License: MIT Imports: 10 Imported by: 2

README

anthropic-sdk-go


Golang SDK for Anthropic Claude AI


Features

  • Sequential contextual memory
  • Automatic prompt formatting and context handling
  • Concise, easy-to-use API
  • Fast data processing

This is a third-party Go SDK library for Claude, started on March 23, 2023.

Claude Docs: https://console.anthropic.com/docs



Note

Anthropic has begun blocking some regions, returning 403 errors. A region check was added in v1.5.0. When a block is detected, the client panics as follows:

panic: check: Oh no, your region may be blocked by Anthropic! Please enable a proxy to bypass the block!

goroutine 1 [running]:
github.com/3JoB/anthropic-sdk-go.New({0xf774d2, 0x5}, {0x0?, 0x0})
        D:/Dev/Go/pW2/lib/anthropic-sdk-go/anthropic.go:41 +0x187
main.main()
        D:/Dev/Go/pW2/lib/anthropic-sdk-go/test/main.go:12 +0x3e

Start

Usage:

$ go get github.com/3JoB/anthropic-sdk-go@v1.6.0

Example usage:

package main

import (
	"fmt"

	"github.com/3JoB/anthropic-sdk-go"
	"github.com/3JoB/anthropic-sdk-go/data"
)

func main() {
	c, err := anthropic.New("your-api-key", anthropic.ModelClaudeInstantV1)
	if err != nil {
		panic(err)
	}

	d, err := c.Send(&anthropic.Opts{
		Message: data.MessageModule{
			Human: "Do you know Golang, please answer me in the shortest possible way.",
		},
		Sender: anthropic.Sender{MaxToken: 1200},
	})

	if err != nil {
		panic(err)
	}

	fmt.Println(d.Response.String())
}

Return:

{"detail":null,"completion":"Hello world! \nfmt.Println(\"Hello world!\")\n\nDone.","stop_reason":"stop_sequence","stop":"\n\nHuman:","log_id":"nop","exception":"","model":"claude-instant-v1","truncated":false}

Context Example:

package main

import (
	"fmt"

	"github.com/3JoB/anthropic-sdk-go"
	"github.com/3JoB/anthropic-sdk-go/data"
)

func main() {
	c, err := anthropic.New("your-api-key", anthropic.ModelClaudeInstantV1)
	if err != nil {
		panic(err)
	}

	d, err := c.Send(&anthropic.Opts{
		Message: data.MessageModule{
			Human: "Do you know Golang, please answer me in the shortest possible way.",
		},
		Sender: anthropic.Sender{MaxToken: 1200},
	})

	if err != nil {
		panic(err)
	}

	fmt.Println(d.Response.String())

	ds, err := c.Send(&anthropic.Opts{
		Message: data.MessageModule{
			Human: "What is its current version number?",
		},
		ContextID: d.ID,
		Sender:    anthropic.Sender{MaxToken: 1200},
	})

	if err != nil {
		panic(err)
	}

	fmt.Println(ds.Response.String())
}

Return:

{"detail":null,"completion":"Hello world! \nfmt.Println(\"Hello world!\")\n\nDone.","stop_reason":"stop_sequence","stop":"\n\nHuman:","log_id":"nop","exception":"","model":"claude-instant-v1","truncated":false}
{"detail":null,"completion":"1.14.4 ","stop_reason":"stop_sequence","stop":"\n\nHuman:","log_id":"nop","exception":"","model":"claude-instant-v1","truncated":false}
Delete the context for an ID:

c, err := anthropic.New("your-api-key", anthropic.ModelClaudeInstantV1)
if err != nil {
	panic(err)
}

d, err := c.Send(&anthropic.Opts{
	Message: data.MessageModule{
		Human: "Do you know Golang, please answer me in the shortest possible way.",
	},
	Sender: anthropic.Sender{MaxToken: 1200},
})

if err != nil {
	panic(err)
}

d.Delete()

Other

This project only guarantees basic usability. If you need new features or improvements, please open a Pull Request.

Contact

Organization email: admin#zxda.top [# => @]


License

This software is distributed under the MIT license.

Documentation

Index

Constants

View Source
const (
	API         string = "https://api.anthropic.com"
	APIComplete string = "/v1/complete"
	UserAgent   string = "" /* 163-byte string literal not displayed */
	SDKVersion  string = "1.6.0"

	ModelClaudeV1             string = "claude-v1"
	ModelClaudeDefault        string = "claude-v1.0"
	ModelClaudeV12            string = "claude-v1.2"
	ModelClaudeV13            string = "claude-v1.3"
	ModelClaudeInstantV1      string = "claude-instant-v1"
	ModelClaudeInstantDefault string = "claude-instant-v1.0"
)

Variables

View Source
var StopSequences []string = []string{"\n\nHuman:"}

Functions

func NewPool added in v1.4.0

func NewPool(key, defaultModel string) sync.Pool

func RefreshContext added in v1.2.0

func RefreshContext()

Types

type AnthropicClient

type AnthropicClient struct {
	Key          string // API Keys
	DefaultModel string // Choose the default AI model
	// contains filtered or unexported fields
}

func New added in v1.2.0

func New(key, defaultModel string) (*AnthropicClient, error)

Create a new Client object.

func (*AnthropicClient) ResetContextPool added in v1.6.0

func (ah *AnthropicClient) ResetContextPool()

func (*AnthropicClient) Send

func (ah *AnthropicClient) Send(senderOpts *Opts) (*Context, error)

Send data to the API endpoint. Before sending out, the data will be processed into a form that the API can recognize.

func (*AnthropicClient) SetTimeOut added in v1.2.0

func (ah *AnthropicClient) SetTimeOut(times int)

The timeout is specified in minutes.

func (*AnthropicClient) TestBan added in v1.5.0

func (ah *AnthropicClient) TestBan() bool

type Context

type Context struct {
	ID       string // Context ID
	Human    string
	RawData  string // Unprocessed raw json data returned by the API endpoint
	Response *Response
}

func (*Context) Add added in v1.2.0

func (c *Context) Add() bool

Add a prompt to the context storage pool

func (*Context) Delete added in v1.2.0

func (c *Context) Delete()

func (*Context) Find added in v1.2.0

func (c *Context) Find() (v []data.MessageModule, ok bool)

func (*Context) Refresh added in v1.2.0

func (c *Context) Refresh()

Refresh the context storage pool (clear all data)

func (*Context) Set added in v1.2.0

func (c *Context) Set(value any) bool

type Opts added in v1.2.0

type Opts struct {
	Message   data.MessageModule
	ContextID string
	Sender    Sender
}

func (*Opts) Complete added in v1.2.0

func (req *Opts) Complete(ctx *Context, client *resty.Client) (*Context, error)

Make a processed request to an API endpoint.

type Response

type Response struct {
	Completion string `json:"completion"`          // The resulting completion up to and excluding the stop sequences.
	StopReason string `json:"stop_reason"`         // The reason sampling stopped: "stop_sequence" if one of the provided stop_sequences was reached, or "max_tokens" if max_tokens_to_sample was exceeded
	Stop       string `json:"stop"`                // If stop_reason is "stop_sequence", this contains the actual stop sequence (from the list passed in) that was seen
	LogID      string `json:"log_id"`              // The ID of the log that generated the response
	Exception  string `json:"exception,omitempty"` // exception
	Model      string `json:"model"`               // Model
	Truncated  bool   `json:"truncated"`           // truncated
	// contains filtered or unexported fields
}

func (*Response) String

func (resp *Response) String() string

type Sender

type Sender struct {
	Prompt        string   `json:"prompt"`                   // (required) The prompt you want Claude to complete. See [our comments on prompts](https://console.anthropic.com/docs/prompt-design#what-is-a-prompt) for how to format it
	Model         string   `json:"model"`                    // (required) As we improve Claude, we develop new versions of it that you can query. This controls which version of Claude answers your request
	StopSequences []string `json:"stop_sequences,omitempty"` // (optional) A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog agent; this library provides the StopSequences constant for this value
	Stream        bool     `json:"stream"`                   // (optional) Whether to incrementally stream the response
	MaxToken      int      `json:"max_tokens_to_sample"`     // (required) A maximum number of tokens to generate before stopping.
	TopK          int      `json:"top_k,omitempty"`          // (optional) Only sample from the top K options for each subsequent token. Used to remove "long tail" low probability responses. Defaults to -1, which disables it.
	TopP          int      `json:"top_p,omitempty"`          // (optional) Does nucleus sampling: compute the cumulative distribution over all options for each subsequent token in decreasing probability order, and cut it off once it reaches the probability specified by top_p. Defaults to -1, which disables it. Note that you should alter either temperature or top_p, but not both
}

Directories

Path Synopsis
internal
