Documentation ¶
Overview ¶
Package cache provides general-purpose in-memory caches. Different caches provide different eviction policies suitable for specific use cases.
The package also implements an LRU cache.
The implementation borrows heavily from SmallLRUCache (originally by Nathan Schrenk). The object maintains a doubly-linked list of elements. When an element is accessed, it is promoted to the head of the list. When space is needed, the element at the tail of the list (the least recently used element) is evicted.
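Conceptually, the bookkeeping described above looks roughly like the following simplified sketch, which uses the standard container/list package and, for brevity, caps the number of entries rather than their total Size(); it illustrates the promote-on-access / evict-from-tail mechanism and is not the package's actual internals:

// entry is the payload stored in each list element (sketch only).
type entry struct {
    key   string
    value interface{}
}

// lruSketch keeps a doubly-linked list ordered from most to least
// recently used, plus an index from key to list element.
type lruSketch struct {
    order *list.List               // front = most recently used, back = least
    index map[string]*list.Element // key -> element in order
    cap   int                      // max entries (the real cache budgets by Size())
}

func newLRUSketch(capacity int) *lruSketch {
    return &lruSketch{order: list.New(), index: map[string]*list.Element{}, cap: capacity}
}

func (c *lruSketch) get(key string) (interface{}, bool) {
    el, ok := c.index[key]
    if !ok {
        return nil, false
    }
    c.order.MoveToFront(el) // promote to the head of the list on access
    return el.Value.(*entry).value, true
}

func (c *lruSketch) set(key string, value interface{}) {
    if el, ok := c.index[key]; ok {
        el.Value.(*entry).value = value
        c.order.MoveToFront(el)
        return
    }
    c.index[key] = c.order.PushFront(&entry{key, value})
    if c.order.Len() > c.cap {
        tail := c.order.Back() // least recently used element
        c.order.Remove(tail)
        delete(c.index, tail.Value.(*entry).key)
    }
}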
Index ¶
- type Cache
- type EvictionCallback
- type ExpiringCache
- type Item
- type LRUCache
- func (lru *LRUCache) Capacity() int64
- func (lru *LRUCache) Clear()
- func (lru *LRUCache) Delete(key string) bool
- func (lru *LRUCache) Evictions() int64
- func (lru *LRUCache) Get(key string) (v Value, ok bool)
- func (lru *LRUCache) Items() []Item
- func (lru *LRUCache) Keys() []string
- func (lru *LRUCache) Length() int64
- func (lru *LRUCache) Oldest() (oldest time.Time)
- func (lru *LRUCache) Peek(key string) (v Value, ok bool)
- func (lru *LRUCache) Set(key string, value Value)
- func (lru *LRUCache) SetCapacity(capacity int64)
- func (lru *LRUCache) SetIfAbsent(key string, value Value)
- func (lru *LRUCache) Size() int64
- func (lru *LRUCache) Stats() (length, size, capacity, evictions int64, oldest time.Time)
- func (lru *LRUCache) StatsJSON() string
- type Stats
- type Value
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Cache ¶
type Cache interface {
    // Set inserts an entry in the cache. This will replace any entry with
    // the same key that is already in the cache. The entry may be automatically
    // expunged from the cache at some point, depending on the eviction policies
    // of the cache and the options specified when the cache was created.
    Set(key interface{}, value interface{})

    // Get retrieves the value associated with the supplied key if the key
    // is present in the cache.
    Get(key interface{}) (value interface{}, ok bool)

    // Remove synchronously deletes the given key from the cache. This has no
    // effect if the key is not currently in the cache.
    Remove(key interface{})

    // RemoveAll synchronously deletes all entries from the cache.
    RemoveAll()

    // Stats returns information about the efficiency of the cache.
    Stats() Stats
}
Cache defines the standard behavior of in-memory thread-safe caches.
Different caches can have different eviction policies which determine when and how entries are automatically removed from the cache.
Using a cache is very simple:
c := NewLRU(5*time.Second, // default per-entry ttl
    5*time.Second,         // eviction interval
    500)                   // max # of entries tracked

c.Set("foo", "bar")       // add an entry

value, ok := c.Get("foo") // try to retrieve the entry
if ok {
    fmt.Printf("Got value %v\n", value)
} else {
    fmt.Printf("Value was not found, must have been evicted\n")
}
type EvictionCallback ¶
type EvictionCallback func(key, value interface{})
EvictionCallback is a function that will be called on entry eviction from an ExpiringCache.
This callback will be invoked immediately after the entry is deleted from the `sync.Map` that backs this cache (using `Map.Delete()`). No locks are held during the invocation of this callback. However, the callback should not block or perform long-running operations.
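For illustration, a minimal callback might simply record what was evicted. The body below is hypothetical; only the EvictionCallback signature comes from the type above, and the snippet assumes the standard log package is available:

var logEviction EvictionCallback = func(key, value interface{}) {
    // Keep this fast and non-blocking, per the note above.
    log.Printf("evicted key=%v value=%v", key, value)
}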
type ExpiringCache ¶
type ExpiringCache interface {
    Cache

    // SetWithExpiration inserts an entry in the cache with a requested expiration time.
    // This will replace any entry with the same key that is already in the cache.
    // The entry will be automatically expunged from the cache at or slightly after the
    // requested expiration time.
    SetWithExpiration(key interface{}, value interface{}, expiration time.Duration)

    // EvictExpired synchronously evicts all expired entries from the cache.
    EvictExpired()
}
ExpiringCache is a cache with entries that are evicted over time.
func NewLRU ¶
func NewLRU(defaultExpiration time.Duration, evictionInterval time.Duration, maxEntries int32) ExpiringCache
NewLRU creates a new cache with an LRU and time-based eviction model.
Cache eviction is done on a periodic basis. Individual cache entries are evicted after their expiration time has passed. The periodic nature of eviction means that cache entries tend to survive around (expirationTime + (evictionInterval / 2)). For example, with a 30-second expiration and a 10-second eviction interval, an entry typically remains in the cache for roughly 35 seconds.
In addition, when the cache is full, adding a new item will displace the item that has been referenced least recently.
defaultExpiration specifies the default minimum amount of time a cached entry remains in the cache before eviction. This value is used with the Set function. Explicit per-entry expiration times can be set with the SetWithExpiration function instead.
evictionInterval specifies the frequency at which eviction activities take place. This should likely be >= 1 second.
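As a rough sketch of these parameters in use (all values below are illustrative), a cache capped at 1000 entries with a one-minute default TTL and a one-second eviction pass could be created like this:

// 1 minute default per-entry TTL, eviction pass every second,
// and at most 1000 entries before LRU displacement kicks in.
c := NewLRU(1*time.Minute, 1*time.Second, 1000)

c.Set("session:42", "some state")              // uses the default expiration
c.SetWithExpiration("temp", 17, 5*time.Second) // per-entry expiration override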
func NewTTL ¶
func NewTTL(defaultExpiration time.Duration, evictionInterval time.Duration) ExpiringCache
NewTTL creates a new cache with a time-based eviction model.
Cache eviction is done on a periodic basis. Individual cache entries are evicted after their expiration time has passed. The periodic nature of eviction means that cache entries tend to survive around (expirationTime + (evictionInterval / 2)).
defaultExpiration specifies the default minimum amount of time a cached entry remains in the cache before eviction. This value is used with the Set function. Explicit per-entry expiration times can be set with the SetWithExpiration function instead.
evictionInterval specifies the frequency at which eviction activities take place. This should likely be >= 1 second.
Since TTL caches only evict data based on the passage of time, it's possible to use up all available memory by continuing to add entries to the cache with a long enough expiration time. Don't do that.
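A minimal sketch of a TTL cache (values are illustrative); note the warning above about unbounded growth:

// TTL-only cache: entries expire over time, but there is no size cap.
c := NewTTL(30*time.Second, 1*time.Second)

c.Set("k", "v")                                  // expires ~30s after insertion
c.SetWithExpiration("short", "v", 2*time.Second) // expires ~2s after insertion

// Force expired entries out now instead of waiting for the next
// periodic eviction pass.
c.EvictExpired()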
func NewTTLWithCallback ¶
func NewTTLWithCallback(defaultExpiration time.Duration, evictionInterval time.Duration, callback EvictionCallback) ExpiringCache
NewTTLWithCallback creates a new cache with a time-based eviction model that will invoke the supplied callback on all evictions. See also: NewTTL.
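A hedged sketch combining NewTTLWithCallback with a simple counting callback; the counter and its use of sync/atomic are illustrative, not part of the package:

var evicted int64

c := NewTTLWithCallback(30*time.Second, 1*time.Second,
    func(key, value interface{}) {
        // Invoked after the entry has been removed; keep it short and non-blocking.
        atomic.AddInt64(&evicted, 1)
    })

c.Set("k", "v") // will eventually expire and bump the counter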
type LRUCache ¶
type LRUCache struct {
// contains filtered or unexported fields
}
LRUCache is a typical LRU cache implementation. If the cache reaches its capacity, the least recently used item is deleted from the cache. Note that the capacity is not the number of items, but the total sum of the Size() of each item.
func NewLRUCache ¶
NewLRUCache creates a new empty cache with the given capacity.
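Neither the exact NewLRUCache signature nor the Value interface is reproduced in this section. Assuming Value only requires a Size() int method (as the capacity description above implies) and that NewLRUCache takes an int64 capacity, a usage sketch might look like this:

// blob is a hypothetical value type; Size() is assumed to be the only
// requirement of the Value interface.
type blob struct{ data []byte }

func (b *blob) Size() int { return len(b.data) }

func useLRU() {
    // The capacity is a budget on the sum of Size() values, not an entry count.
    lru := NewLRUCache(64 * 1024 * 1024)

    lru.Set("profile:42", &blob{data: []byte("cached bytes")})

    if v, ok := lru.Get("profile:42"); ok {
        b := v.(*blob) // values come back as Value; assert to the concrete type
        _ = b.data
    }
}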
func (*LRUCache) Get ¶
Get returns a value from the cache, and marks the entry as most recently used.
func (*LRUCache) Items ¶
Items returns all the values for the cache, ordered from most recently used to least recently used.
func (*LRUCache) Keys ¶
Keys returns all the keys for the cache, ordered from most recently used to least recently used.
func (*LRUCache) Oldest ¶
Oldest returns the insertion time of the oldest element in the cache, or a zero time (one for which IsZero() returns true) if the cache is empty.
func (*LRUCache) SetCapacity ¶
SetCapacity sets the capacity of the cache. If the new capacity is smaller and the current cache size exceeds it, the cache will be shrunk.
func (*LRUCache) SetIfAbsent ¶
SetIfAbsent sets the value in the cache only if the key is not already present; if an entry for the key already exists, it is left unchanged.
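Continuing the hypothetical useLRU sketch under NewLRUCache (same lru variable and blob type), the remaining mutators behave like this:

lru.SetIfAbsent("profile:42", &blob{data: []byte("ignored")}) // key already present: no change

lru.SetCapacity(16 * 1024 * 1024) // shrink the byte budget; LRU entries may be evicted

fmt.Println(lru.StatsJSON()) // current stats as a JSON string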
type Stats ¶
type Stats struct {
    // Writes captures the number of times state in the cache was added or updated.
    Writes uint64

    // Hits captures the number of times a Get operation succeeded to find an entry in the cache.
    Hits uint64

    // Misses captures the number of times a Get operation failed to find an entry in the cache.
    Misses uint64

    // Evictions captures the number of entries that have been evicted from the cache.
    Evictions uint64

    // Removals captures the number of entries that have been explicitly removed from the cache.
    Removals uint64
}
Stats captures usage statistics for an individual cache, useful for assessing the cache's efficiency.
The values in this struct are approximations of the current state of the cache. For the sake of efficiency, certain edge cases in the implementation can lead to inaccuracies.
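As an illustrative use, a caller might derive a hit rate from these counters; the arithmetic below, not the field names, is the invented part, and c is assumed to be any Cache:

s := c.Stats()

total := s.Hits + s.Misses
if total > 0 {
    hitRate := float64(s.Hits) / float64(total)
    fmt.Printf("hits=%d misses=%d hit rate=%.1f%%\n", s.Hits, s.Misses, 100*hitRate)
}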