client

package
v0.14.5
Warning

This package is not in the latest version of its module.

Published: Dec 10, 2024 License: Apache-2.0 Imports: 34 Imported by: 0

Documentation

Index

Constants

const (
	CacheScopeLocal = CacheScope(1)
	CacheScopeL1    = CacheScope(2)
	CacheScopeL2    = CacheScope(4)
)

Variables

This section is empty.

Functions

func FloatEntry added in v0.8.0

func FloatEntry(prec int) *entry

FloatEntry creates an entry with prec digits after the decimal point. A prec less than or equal to 0 is ignored.

func Marshal added in v0.8.0

func Marshal(data interface{}, id string) ([]byte, error)

Types

type Cachable

type Cachable interface {
	CacheKey() string
	CacheKeyAt(index int) string
	BatchSize() int
	FlagCacheHit(index int)
	CacheHit(index int) bool
}

Cachable describes an input that exposes cache keys and per-index cache-hit tracking for batched requests.

type CacheScope

type CacheScope int

func (CacheScope) IsL1

func (c CacheScope) IsL1() bool

func (CacheScope) IsL2

func (c CacheScope) IsL2() bool

func (CacheScope) IsLocal

func (c CacheScope) IsLocal() bool
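The CacheScope constants are powers of two, suggesting scopes combine as bit flags that the IsLocal/IsL1/IsL2 predicates then inspect. The snippet below is an illustrative re-implementation of that flag logic, not this package's actual code:

```go
package main

import "fmt"

// CacheScope mirrors the client's flag-style constants; the predicate
// bodies here are an assumption about how the flags compose.
type CacheScope int

const (
	CacheScopeLocal = CacheScope(1)
	CacheScopeL1    = CacheScope(2)
	CacheScopeL2    = CacheScope(4)
)

func (c CacheScope) IsLocal() bool { return c&CacheScopeLocal != 0 }
func (c CacheScope) IsL1() bool    { return c&CacheScopeL1 != 0 }
func (c CacheScope) IsL2() bool    { return c&CacheScopeL2 != 0 }

func main() {
	// Combine local and L1 scopes; L2 stays disabled.
	scope := CacheScopeLocal | CacheScopeL1
	fmt.Println(scope.IsLocal(), scope.IsL1(), scope.IsL2())
}
```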

type Config

type Config struct {
	Hosts []*Host
	Model string

	CacheSizeMb int

	// CacheScope limits which caches are available to this client.
	CacheScope *CacheScope

	Datastore *config.Remote

	// MaxRetry defines the maximum number of HTTP requests that should be sent
	// during a shared/client.(*Service).Run()
	MaxRetry int

	Debug              bool
	DictHashValidation bool
}

Config represents a client config

func (*Config) CacheSize

func (c *Config) CacheSize() int

CacheSize returns cache size

type Dictionary

type Dictionary struct {
	// contains filtered or unexported fields
}

Dictionary helps identify any out-of-vocabulary input values for reducing the cache space; this enables us to leverage any dimensionality reduction within the model to optimize wall-clock performance. This is primarily useful for categorical inputs, as well as any continuous inputs with an acceptable quantization.
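The quantization idea can be illustrated independently of this package: rounding a continuous input to a fixed precision collapses nearby values onto one cache key (compare FloatEntry's prec parameter). This is a sketch of the general technique, not this package's implementation:

```go
package main

import (
	"fmt"
	"math"
)

// quantize rounds v to prec digits after the decimal point,
// so nearby values share one cache key.
func quantize(v float64, prec int) float64 {
	p := math.Pow(10, float64(prec))
	return math.Round(v*p) / p
}

func main() {
	// Both inputs collapse onto the same quantized value.
	fmt.Println(quantize(0.12345, 2))
	fmt.Println(quantize(0.12411, 2))
}
```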

func NewDictionary

func NewDictionary(dict *common.Dictionary, inputs []*shared.Field) *Dictionary

NewDictionary creates a new Dictionary.

func (*Dictionary) Fields added in v0.2.2

func (d *Dictionary) Fields() map[string]*shared.Field

TODO refactor, this has a singular use case

func (*Dictionary) KeysLen

func (d *Dictionary) KeysLen() int

type Host

type Host struct {

	// used as both a check to see if a host is down
	// and to pause requests to prevent downstream overload
	RequestTimeout time.Duration

	*circut.Breaker
	// contains filtered or unexported fields
}

Host represents endpoint host

func NewHost

func NewHost(name string, port int) *Host

NewHost returns a new Host.

func NewHosts

func NewHosts(port int, names []string) []*Host

NewHosts creates a Host for each of the provided names, using the given port.

func (*Host) Init

func (h *Host) Init()

func (*Host) IsSecurePort

func (h *Host) IsSecurePort() bool

IsSecurePort returns true if the host uses a secure port.

func (*Host) Name

func (h *Host) Name() string

Name returns the host name (domain name).

func (*Host) Port

func (h *Host) Port() int

func (*Host) Probe

func (h *Host) Probe()

Probe is called by shared/circuit.(*Breaker).resetIfDue().

type Message

type Message struct {
	// contains filtered or unexported fields
}

Message represents the client-side perspective of the ML prediction. The JSON payload is built up across method calls; be sure to call (*Message).start() to write the opening "{". TODO document how cache management is built into this type. There are two "modes" for building the message: single and batch. In single mode, the JSON object contents are written to Message.buf on each method call. Single-mode functions include:

(*Message).StringKey(string, string)
(*Message).IntKey(string, int)
(*Message).FloatKey(string, float32)

Batch mode is initiated by calling (*Message).SetBatchSize() with a value greater than 0. In batch mode, the JSON payload is generated when (*Message).end() is called. Batch-mode functions include (the method names are plural):

(*Message).StringsKey(string, []string)
(*Message).IntsKey(string, []int)
(*Message).FloatsKey(string, []float32)

There is no strict struct for the request payload since some of the request keys are generated dynamically from the model inputs. The resulting JSON has property keys derived from the model, plus two optional keys, "batch_size" and "cache_key". Depending on whether single or batch mode is used, the property values are scalars or arrays. See service.Request for the server-side perspective. TODO separate single and batch-sized requests into their respective call endpoints; the abstracted polymorphism is currently more painful than convenient.

func (*Message) BatchSize

func (m *Message) BatchSize() int

func (*Message) Bytes

func (m *Message) Bytes() []byte

Bytes returns message bytes

func (*Message) CacheHit

func (m *Message) CacheHit(index int) bool

func (*Message) CacheKey

func (m *Message) CacheKey() string

CacheKey returns cache key

func (*Message) CacheKeyAt

func (m *Message) CacheKeyAt(index int) string

CacheKeyAt returns cache key for supplied index

func (*Message) FlagCacheHit

func (m *Message) FlagCacheHit(index int)

func (*Message) FloatKey

func (m *Message) FloatKey(key string, value float32)

FloatKey sets key/value pair

func (*Message) FloatsKey

func (m *Message) FloatsKey(key string, values []float32)

FloatsKey sets key/values pair

func (*Message) IntKey

func (m *Message) IntKey(key string, value int)

IntKey sets key/value pair

func (*Message) IntsKey

func (m *Message) IntsKey(key string, values []int)

IntsKey sets key/values pair

func (*Message) Release

func (m *Message) Release()

Release returns the message to the grpcPool. TODO this should not be public; either Service should be 100% responsible for reuse, or the caller should be.

func (*Message) SetBatchSize added in v0.1.3

func (m *Message) SetBatchSize(batchSize int)

func (*Message) Size

func (m *Message) Size() int

Size returns message size

func (*Message) StringKey

func (m *Message) StringKey(key, value string)

StringKey sets key/value pair

func (*Message) Strings added in v0.2.2

func (m *Message) Strings() []string

Strings is used to debug the current message.

func (*Message) StringsKey

func (m *Message) StringsKey(key string, values []string)

StringsKey sets key/values pair

type Messages

type Messages interface {
	Borrow() *Message
}

Messages represents a pool from which a Message can be borrowed.

func NewMessages

func NewMessages(newDict func() *Dictionary) Messages

NewMessages creates a new message pool (grpcPool).

type Option

type Option interface {
	// Apply applies settings
	Apply(c *Service)
}

Option applies a setting to the client (functional options pattern).

func WithCacheScope

func WithCacheScope(scope CacheScope) Option

WithCacheScope creates cache scope option

func WithCacheSize

func WithCacheSize(sizeMB int) Option

WithCacheSize overrides the cache size provided by the server.

func WithDataStorer

func WithDataStorer(storer datastore.Storer) Option

WithDataStorer provides a concrete datastore instance instead of determining it from configuration values.

func WithDebug added in v0.6.0

func WithDebug(enable bool) Option

WithDebug sets debugging.

func WithDictionary

func WithDictionary(dictionary *Dictionary) Option

WithDictionary overwrites the initial dictionary.

func WithGmetrics

func WithGmetrics(gmetrics *gmetric.Service) Option

WithGmetrics binds the *gmetric.Service to the client.

func WithHashValidation

func WithHashValidation(enable bool) Option

WithHashValidation overrides DictHashValidation.

func WithRemoteConfig

func WithRemoteConfig(config *cconfig.Remote) Option

WithRemoteConfig provides a predefined configuration instead of sending a request to the mly server to fetch it.

type Releaser

type Releaser interface {
	Release()
}

Releaser releases an underlying resource.

type Response

type Response struct {
	Status      string        `json:"status"`
	Error       string        `json:"error,omitempty"`
	ServiceTime time.Duration `json:"serviceTime"`
	DictHash    int           `json:"dictHash"`
	Data        interface{}   `json:"data"`
}

Response represents a response

func NewResponse

func NewResponse(data interface{}) *Response

NewResponse creates a new response

func (*Response) DataItemType

func (r *Response) DataItemType() (reflect.Type, error)

func (*Response) NKeys

func (r *Response) NKeys() int

NKeys returns the number of JSON object keys (gojay API).

func (*Response) UnmarshalJSONObject

func (r *Response) UnmarshalJSONObject(dec *gojay.Decoder, key string) error

UnmarshalJSONObject unmarshals a JSON object (gojay API).

type Service

type Service struct {
	Config

	sync.RWMutex

	ErrorHistory tracker.Tracker
	// contains filtered or unexported fields
}

Service represents an mly client.

func New

func New(model string, hosts []*Host, options ...Option) (*Service, error)

New creates a new client.

func (*Service) Close

func (s *Service) Close() error

func (*Service) NewMessage

func (s *Service) NewMessage() *Message

NewMessage returns a new message

func (*Service) Run

func (s *Service) Run(ctx context.Context, input interface{}, response *Response) error

Run fetches the model prediction and populates response.Data with the result. input can be of various types; if it implements Cachable, the configured caching system is used.
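Putting the pieces together, a typical call might look like the sketch below. The host names, port, model name, input keys, and output type are all hypothetical; only the package-level functions and methods come from the documentation above:

```go
// Sketch only: host names, port, model name, and input keys are
// hypothetical; adjust them to your deployment.
hosts := client.NewHosts(8086, []string{"mly-1.example.com", "mly-2.example.com"})
srv, err := client.New("my_model", hosts, client.WithDebug(true))
if err != nil {
	log.Fatal(err)
}
defer srv.Close()

msg := srv.NewMessage()
msg.StringKey("city", "paris") // keys must match the model's inputs
msg.IntKey("qty", 3)

// prediction is a caller-defined type matching the model's output.
response := client.NewResponse(&prediction{})
if err := srv.Run(context.Background(), msg, response); err != nil {
	log.Fatal(err)
}
msg.Release() // see the Release TODO: ownership of reuse is still in flux
```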
