Documentation ¶
Index ¶
- type LazyLRU
- func (lru *LazyLRU[K, V]) Close()
- func (lru *LazyLRU[K, V]) Delete(key K)
- func (lru *LazyLRU[K, V]) Get(key K) (V, bool)
- func (lru *LazyLRU[K, V]) IsRunning() bool
- func (lru *LazyLRU[K, V]) Len() int
- func (lru *LazyLRU[K, V]) MGet(keys ...K) map[K]V
- func (lru *LazyLRU[K, V]) MSet(keys []K, values []V) error
- func (lru *LazyLRU[K, V]) MSetTTL(keys []K, values []V, ttl time.Duration) error
- func (lru *LazyLRU[K, V]) Reap()
- func (lru *LazyLRU[K, V]) Set(key K, value V)
- func (lru *LazyLRU[K, V]) SetTTL(key K, value V, ttl time.Duration)
- func (lru *LazyLRU[K, V]) Stats() Stats
- type Stats
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type LazyLRU ¶
type LazyLRU[K comparable, V any] struct { // contains filtered or unexported fields }
LazyLRU is an LRU cache that only reshuffles values if it is somewhat full. This is a cache implementation that uses a hash table for lookups and a priority queue to approximate LRU. Approximate because the usage is not updated on every get. Rather, items close to the head of the queue, those most likely to be read again and least likely to age out, are not updated. This assumption does not hold under every condition -- if the cache is undersized and churning a lot, this implementation will perform worse than an LRU that updates on every read.
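The lazy-reordering idea can be sketched with a min-heap ordered by last use, where a read only reshuffles items sitting in the older half of the heap. Everything below (`lazyCache`, `ageHeap`, the halfway cutoff) is an illustrative stand-in for the technique, not the package's internals:

```go
package main

import (
	"container/heap"
	"fmt"
)

// item tracks a cached value plus its position and priority in the heap.
type item struct {
	key      string
	value    int
	lastUsed int64 // logical clock; lower means older
	index    int   // position in the heap slice
}

// ageHeap is a min-heap ordered by lastUsed: the oldest item sits at the root.
type ageHeap []*item

func (h ageHeap) Len() int           { return len(h) }
func (h ageHeap) Less(i, j int) bool { return h[i].lastUsed < h[j].lastUsed }
func (h ageHeap) Swap(i, j int) {
	h[i], h[j] = h[j], h[i]
	h[i].index = i
	h[j].index = j
}
func (h *ageHeap) Push(x any) {
	it := x.(*item)
	it.index = len(*h)
	*h = append(*h, it)
}
func (h *ageHeap) Pop() any {
	old := *h
	n := len(old)
	it := old[n-1]
	*h = old[:n-1]
	return it
}

type lazyCache struct {
	byKey map[string]*item
	heap  ageHeap
	clock int64
}

func (c *lazyCache) set(key string, value int) {
	c.clock++
	if it, ok := c.byKey[key]; ok {
		it.value, it.lastUsed = value, c.clock
		heap.Fix(&c.heap, it.index)
		return
	}
	it := &item{key: key, value: value, lastUsed: c.clock}
	c.byKey[key] = it
	heap.Push(&c.heap, it)
}

// get bumps lastUsed only for items in the older half of the heap.
// Recently used items near the young end are left in place, which is
// the "lazy" part: usage is not updated on every get.
func (c *lazyCache) get(key string) (int, bool) {
	it, ok := c.byKey[key]
	if !ok {
		return 0, false
	}
	c.clock++
	if it.index < len(c.heap)/2 { // rough proxy for "likely to age out"
		it.lastUsed = c.clock
		heap.Fix(&c.heap, it.index)
	}
	return it.value, true
}

func main() {
	c := &lazyCache{byKey: map[string]*item{}}
	for i := 0; i < 4; i++ {
		c.set(fmt.Sprintf("k%d", i), i)
	}
	v, ok := c.get("k0")
	fmt.Println(v, ok)
}
```

The heap index is only a rough proxy for recency, which is exactly the approximation the doc comment warns about: under heavy churn in an undersized cache, skipped updates cost more than they save.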
func New ¶ deprecated
New creates a LazyLRU[string, interface{}] with the given capacity and default expiration. This is compatible with the pre-generic interface. The generic version is available as `NewT`. If maxItems is zero or fewer, the cache will not hold anything, but does still incur some runtime penalties. If ttl is greater than zero, a background ticker will be engaged to proactively remove expired items.
Deprecated: To avoid casting, use the generic NewT constructor instead.
func NewT ¶ added in v0.4.0
NewT creates a LazyLRU with the given capacity and default expiration. If maxItems is zero or fewer, the cache will not hold anything, but does still incur some runtime penalties. If ttl is greater than zero, a background ticker will be engaged to proactively remove expired items.
func (*LazyLRU[K, V]) Close ¶
func (lru *LazyLRU[K, V]) Close()
Close stops the reaper process. This is safe to call multiple times.
func (*LazyLRU[K, V]) Delete ¶ added in v0.3.0
func (lru *LazyLRU[K, V]) Delete(key K)
Delete eliminates a key from the cache. Removing a key that is not in the index is safe.
func (*LazyLRU[K, V]) Get ¶
func (lru *LazyLRU[K, V]) Get(key K) (V, bool)
Get retrieves a value from the cache. The returned bool indicates whether the key was found in the cache.
func (*LazyLRU[K, V]) MGet ¶
func (lru *LazyLRU[K, V]) MGet(keys ...K) map[K]V
MGet retrieves values from the cache. Missing values will not be returned.
func (*LazyLRU[K, V]) MSet ¶
func (lru *LazyLRU[K, V]) MSet(keys []K, values []V) error
MSet writes multiple keys and values to the cache. If the "keys" and "values" parameters are of different lengths, this method will return an error.
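The length-mismatch contract described for MSet can be sketched with a generic helper over a plain map (`mSet` is a hypothetical stand-in, not the package's code):

```go
package main

import (
	"errors"
	"fmt"
)

// mSet writes parallel key/value slices into a map, returning an error
// when the slices have different lengths, as MSet's contract describes.
func mSet[K comparable, V any](m map[K]V, keys []K, values []V) error {
	if len(keys) != len(values) {
		return errors.New("keys and values must be the same length")
	}
	for i, k := range keys {
		m[k] = values[i]
	}
	return nil
}

func main() {
	m := map[string]int{}
	if err := mSet(m, []string{"a", "b"}, []int{1, 2}); err != nil {
		fmt.Println("unexpected:", err)
	}
	fmt.Println(m["a"], m["b"]) // 1 2

	err := mSet(m, []string{"x"}, []int{})
	fmt.Println(err != nil) // true: mismatched lengths are rejected
}
```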
func (*LazyLRU[K, V]) MSetTTL ¶
func (lru *LazyLRU[K, V]) MSetTTL(keys []K, values []V, ttl time.Duration) error
MSetTTL writes multiple keys and values to the cache, expiring with the given time-to-live value. If the "keys" and "values" parameters are of different lengths, this method will return an error.
func (*LazyLRU[K, V]) Reap ¶
func (lru *LazyLRU[K, V]) Reap()
Reap removes all expired items from the cache.