Documentation ¶
Index ¶
- Constants
- func FloatEntry(prec int) *entry
- func Marshal(data interface{}, id string) ([]byte, error)
- type Cachable
- type CacheScope
- type Config
- type Dictionary
- type Host
- type Message
- func (m *Message) BatchSize() int
- func (m *Message) Bytes() []byte
- func (m *Message) CacheHit(index int) bool
- func (m *Message) CacheKey() string
- func (m *Message) CacheKeyAt(index int) string
- func (m *Message) FlagCacheHit(index int)
- func (m *Message) FloatKey(key string, value float32)
- func (m *Message) FloatsKey(key string, values []float32)
- func (m *Message) IntKey(key string, value int)
- func (m *Message) IntsKey(key string, values []int)
- func (m *Message) Release()
- func (m *Message) SetBatchSize(batchSize int)
- func (m *Message) Size() int
- func (m *Message) StringKey(key, value string)
- func (m *Message) Strings() []string
- func (m *Message) StringsKey(key string, values []string)
- type Messages
- type Option
- func WithCacheScope(scope CacheScope) Option
- func WithCacheSize(sizeMB int) Option
- func WithDataStorer(storer datastore.Storer) Option
- func WithDebug(enable bool) Option
- func WithDictionary(dictionary *Dictionary) Option
- func WithGmetrics(gmetrics *gmetric.Service) Option
- func WithHashValidation(enable bool) Option
- func WithRemoteConfig(config *cconfig.Remote) Option
- type Releaser
- type Response
- type Service
Constants ¶
const (
	CacheScopeLocal = CacheScope(1)
	CacheScopeL1    = CacheScope(2)
	CacheScopeL2    = CacheScope(4)
)
Variables ¶
This section is empty.
Functions ¶
func FloatEntry ¶ added in v0.8.0
func FloatEntry(prec int) *entry
FloatEntry creates an entry with prec digits after the decimal point. A prec of less than or equal to 0 is ignored.
Types ¶
type Cachable ¶
type Cachable interface {
	CacheKey() string
	CacheKeyAt(index int) string
	BatchSize() int
	FlagCacheHit(index int)
	CacheHit(index int) bool
}
Cachable describes a value that provides cache keys and cache-hit bookkeeping.
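Because *Message provides all five methods (CacheKey, CacheKeyAt, BatchSize, FlagCacheHit, CacheHit), it satisfies Cachable. A minimal compile-time assertion sketch, assuming this package is imported as client:

var _ client.Cachable = (*client.Message)(nil)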
type CacheScope ¶
type CacheScope int
func (CacheScope) IsL1 ¶
func (c CacheScope) IsL1() bool
func (CacheScope) IsL2 ¶
func (c CacheScope) IsL2() bool
func (CacheScope) IsLocal ¶
func (c CacheScope) IsLocal() bool
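The constants are powers of two, so scopes can be combined as bit flags. A minimal sketch, assuming the package is imported as client and that the IsLocal/IsL1/IsL2 methods test the corresponding bit:

scope := client.CacheScopeLocal | client.CacheScopeL1
if scope.IsLocal() {
	// in-process (local) caching is enabled
}
if scope.IsL2() {
	// L2 caching is enabled (not the case for this value)
}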
type Config ¶
type Config struct {
	Hosts              []*Host
	Model              string
	CacheSizeMb        int
	CacheScope         *CacheScope
	Datastore          *config.Remote
	MaxRetry           int
	Debug              bool
	DictHashValidation bool
}
Config represents a client config
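A minimal sketch of a client config literal, assuming the package is imported as client; the model name and values below are illustrative only:

scope := client.CacheScopeL1
cfg := &client.Config{
	Model:       "my_model",       // hypothetical model ID
	Hosts:       []*client.Host{}, // endpoint hosts go here
	CacheSizeMb: 256,
	CacheScope:  &scope,
	MaxRetry:    3,
	Debug:       true,
}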
type Dictionary ¶
type Dictionary struct {
// contains filtered or unexported fields
}
Dictionary helps identify out-of-vocabulary input values in order to reduce the cache space; this lets us leverage any dimensionality reduction within the model to optimize wall-clock performance. This is primarily useful for categorical inputs as well as any continuous inputs with an acceptable quantization.
func NewDictionary ¶
func NewDictionary(dict *common.Dictionary, inputs []*shared.Field) *Dictionary
NewDictionary creates a new Dictionary.
func (*Dictionary) Fields ¶ added in v0.2.2
func (d *Dictionary) Fields() map[string]*shared.Field
TODO refactor, this has a singular use case
func (*Dictionary) KeysLen ¶
func (d *Dictionary) KeysLen() int
type Host ¶
Host represents an endpoint host.
func (*Host) IsSecurePort ¶
IsSecurePort returns true if the host uses a secure port.
type Message ¶
type Message struct {
// contains filtered or unexported fields
}
Message represents the client-side perspective of an ML prediction. The JSON payload is built up as methods are called; be sure to call (*Message).start() to set up the opening "{". TODO document how cache management is built into this type. There are two "modes" for building the message: single and batch. In single mode, the JSON object contents are written to Message.buf per method call. Single mode functions include:
(*Message).StringKey(string, string)
(*Message).IntKey(string, int)
(*Message).FloatKey(string, float32)
Batch mode is initiated by calling (*Message).SetBatchSize() with a value greater than 0. In batch mode, the JSON payload is generated when (*Message).end() is called. Batch mode functions include (the type name is plural):
(*Message).StringsKey(string, []string)
(*Message).IntsKey(string, []int)
(*Message).FloatsKey(string, []float32)
There is no strict struct for the request payload since some of the request keys are generated dynamically from the model inputs. The resulting JSON has property keys set based on the model, plus two optional keys, "batch_size" and "cache_key". Depending on whether single or batch mode is used, the property values are scalars or arrays. See service.Request for the server-side perspective. TODO: separate single and batch requests into their respective call endpoints; the abstracted polymorphism is currently more painful than convenient.
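A hedged sketch of building payloads in both modes, assuming msgs is a Messages pool and that the input keys below ("category", "length", "score") match a hypothetical model; Bytes() is taken to return the accumulated JSON payload:

// Single mode: one scalar value per input key.
msg := msgs.Borrow()
msg.StringKey("category", "news")
msg.IntKey("length", 128)
msg.FloatKey("score", 0.75)
payload := msg.Bytes() // JSON object with scalar property values
_ = payload
msg.Release()

// Batch mode: one slice of values per input key, plus "batch_size".
batch := msgs.Borrow()
batch.SetBatchSize(2)
batch.StringsKey("category", []string{"news", "sports"})
batch.IntsKey("length", []int{128, 64})
batch.FloatsKey("score", []float32{0.75, 0.42})
_ = batch.Bytes()
batch.Release()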
func (*Message) CacheKeyAt ¶
CacheKeyAt returns the cache key for the supplied index.
func (*Message) FlagCacheHit ¶
func (*Message) SetBatchSize ¶ added in v0.1.3
func (*Message) StringsKey ¶
StringsKey sets a key/values pair.
type Messages ¶
type Messages interface {
Borrow() *Message
}
Messages represents a pool of reusable messages.
func NewMessages ¶
func NewMessages(newDict func() *Dictionary) Messages
NewMessages creates a new message pool.
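A minimal pool lifecycle sketch, assuming the package is imported as client and newDict is a caller-supplied factory returning the client's *Dictionary:

msgs := client.NewMessages(newDict) // newDict func() *client.Dictionary
msg := msgs.Borrow()
defer msg.Release()
// ... populate msg and send the request ...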
type Option ¶
type Option interface {
	// Apply applies settings
	Apply(c *Service)
}
Option applies an optional setting to the client Service.
func WithCacheScope ¶
func WithCacheScope(scope CacheScope) Option
WithCacheScope creates a cache scope option.
func WithDataStorer ¶
func WithDictionary ¶
func WithDictionary(dictionary *Dictionary) Option
WithDictionary creates a dictionary option.
func WithGmetrics ¶
WithGmetrics returns a gmetric service option.
func WithHashValidation ¶
WithHashValidation creates a dictionary hash validation option.
func WithRemoteConfig ¶
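Options are typically collected and handed to the client constructor; a sketch of assembling a few of them, assuming the package is imported as client (the constructor itself is not shown here, and the values are illustrative):

opts := []client.Option{
	client.WithCacheScope(client.CacheScopeL1),
	client.WithCacheSize(512), // cache size in MB
	client.WithDebug(true),
	client.WithHashValidation(true),
}
_ = opts // pass to the client constructor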
type Response ¶
type Response struct {
	Status      string        `json:"status"`
	Error       string        `json:"error,omitempty"`
	ServiceTime time.Duration `json:"serviceTime"`
	DictHash    int           `json:"dictHash"`
	Data        interface{}   `json:"data"`
}
Response represents a response
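A minimal sketch of decoding a JSON response body into Response using encoding/json, assuming the package is imported as client; the payload below is illustrative only:

var resp client.Response
body := []byte(`{"status":"ok","serviceTime":1200000,"dictHash":42,"data":{"prediction":[0.91]}}`)
if err := json.Unmarshal(body, &resp); err != nil {
	log.Fatal(err)
}
fmt.Println(resp.Status, resp.DictHash)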