Documentation ¶
Overview ¶
Package lru provides three different LRU caches of varying sophistication.
Cache is a simple LRU cache. It is based on the LRU implementation in groupcache: https://github.com/golang/groupcache/tree/master/lru
TwoQueueCache tracks frequently used and recently used entries separately. This avoids a burst of accesses from taking out frequently used entries, at the cost of about 2x computational overhead and some extra bookkeeping.
ARCCache is an adaptive replacement cache. It tracks recent evictions as well as recent usage in both the frequent and recent caches. Its computational overhead is comparable to TwoQueueCache, but the memory overhead is linear with the size of the cache.
ARC has been patented by IBM, so do not use it if that is problematic for your program.
All caches in this package take locks while operating, and are therefore thread-safe for consumers.
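As a rough illustration, the sketch below exercises the simple Cache; the import path (github.com/hashicorp/golang-lru) and the New(size) constructor are assumptions, since neither appears in this excerpt.

// Minimal sketch of the simple Cache. The import path and the New(size)
// constructor are assumed; they are not shown in this excerpt.
package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	// Create a fixed-size cache holding at most 128 entries.
	cache, err := lru.New(128) // assumed constructor
	if err != nil {
		panic(err)
	}

	cache.Add("a", 1)
	cache.Add("b", 2)

	// Get updates the recency of "a".
	if v, ok := cache.Get("a"); ok {
		fmt.Println("a =", v)
	}

	// Contains does not update recency.
	fmt.Println(cache.Contains("b"), cache.Len())
}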
Index ¶
- Constants
- type ARCCache
- func (c *ARCCache) Add(key, value interface{})
- func (c *ARCCache) Contains(key interface{}) bool
- func (c *ARCCache) Get(key interface{}) (value interface{}, ok bool)
- func (c *ARCCache) Keys() []interface{}
- func (c *ARCCache) Len() int
- func (c *ARCCache) Peek(key interface{}) (value interface{}, ok bool)
- func (c *ARCCache) Purge()
- func (c *ARCCache) Remove(key interface{})
- type Cache
- func (c *Cache) Add(key, value interface{}) (evicted bool)
- func (c *Cache) Contains(key interface{}) bool
- func (c *Cache) ContainsOrAdd(key, value interface{}) (ok, evicted bool)
- func (c *Cache) Get(key interface{}) (value interface{}, ok bool)
- func (c *Cache) Keys() []interface{}
- func (c *Cache) Len() int
- func (c *Cache) Peek(key interface{}) (value interface{}, ok bool)
- func (c *Cache) Purge()
- func (c *Cache) Remove(key interface{})
- func (c *Cache) RemoveOldest()
- type TwoQueueCache
- func (c *TwoQueueCache) Add(key, value interface{})
- func (c *TwoQueueCache) Contains(key interface{}) bool
- func (c *TwoQueueCache) Get(key interface{}) (value interface{}, ok bool)
- func (c *TwoQueueCache) Keys() []interface{}
- func (c *TwoQueueCache) Len() int
- func (c *TwoQueueCache) Peek(key interface{}) (value interface{}, ok bool)
- func (c *TwoQueueCache) Purge()
- func (c *TwoQueueCache) Remove(key interface{})
Constants ¶
const (
	// Default2QRecentRatio is the ratio of the 2Q cache dedicated
	// to recently added entries that have only been accessed once.
	Default2QRecentRatio = 0.25

	// Default2QGhostEntries is the default ratio of ghost
	// entries kept to track entries recently evicted.
	Default2QGhostEntries = 0.50
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type ARCCache ¶
type ARCCache struct {
// contains filtered or unexported fields
}
ARCCache is a thread-safe fixed size Adaptive Replacement Cache (ARC). ARC is an enhancement over the standard LRU cache in that it tracks both frequency and recency of use. This avoids a burst in access to new entries from evicting frequently used older entries. It adds some additional tracking overhead to a standard LRU cache; computationally it is roughly 2x the cost, and the extra memory overhead is linear with the size of the cache. ARC has been patented by IBM, but it is similar to the TwoQueueCache (2Q), which requires setting parameters.
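A minimal sketch of typical ARCCache use; the NewARC(size) constructor and the import path are assumptions not listed in this excerpt.

// Sketch only: NewARC and the import path are assumed, not shown in this excerpt.
package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	arc, err := lru.NewARC(128) // assumed constructor
	if err != nil {
		panic(err)
	}

	arc.Add("k", "v")

	// Get promotes "k" toward the frequently used side of the cache.
	if v, ok := arc.Get("k"); ok {
		fmt.Println(v)
	}

	// Peek reads the value without touching recency or frequency.
	if v, ok := arc.Peek("k"); ok {
		fmt.Println(v)
	}
}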
func (*ARCCache) Add ¶
func (c *ARCCache) Add(key, value interface{})
Add adds a value to the cache.
func (*ARCCache) Contains ¶
func (c *ARCCache) Contains(key interface{}) bool
Contains is used to check if the cache contains a key without updating recency or frequency.
func (*ARCCache) Get ¶
func (c *ARCCache) Get(key interface{}) (value interface{}, ok bool)
Get looks up a key's value from the cache.
func (*ARCCache) Peek ¶
func (c *ARCCache) Peek(key interface{}) (value interface{}, ok bool)
Peek is used to inspect the cache value of a key without updating recency or frequency.
type Cache ¶
type Cache struct {
// contains filtered or unexported fields
}
Cache is a thread-safe fixed size LRU cache.
func NewWithEvict ¶
NewWithEvict constructs a fixed size cache with the given eviction callback.
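A sketch of wiring up the eviction callback; the exact NewWithEvict signature is not reproduced in this excerpt, so the callback form used below (func(key, value interface{})) and the import path are assumptions.

// Sketch of an eviction callback; the NewWithEvict argument form shown here
// is assumed, since this excerpt only gives its description.
package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	onEvicted := func(key interface{}, value interface{}) {
		fmt.Printf("evicted %v=%v\n", key, value)
	}

	cache, err := lru.NewWithEvict(2, onEvicted) // assumed signature
	if err != nil {
		panic(err)
	}

	cache.Add("a", 1)
	cache.Add("b", 2)
	cache.Add("c", 3) // exceeds the size of 2, so "a" is evicted and the callback fires
}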
func (*Cache) Add ¶
func (c *Cache) Add(key, value interface{}) (evicted bool)
Add adds a value to the cache. Returns true if an eviction occurred.
func (*Cache) Contains ¶
func (c *Cache) Contains(key interface{}) bool
Contains checks if a key is in the cache, without updating the recent-ness or deleting it for being stale.
func (*Cache) ContainsOrAdd ¶
func (c *Cache) ContainsOrAdd(key, value interface{}) (ok, evicted bool)
ContainsOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether the key was found and whether an eviction occurred.
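A small sketch of the two return values, with the constructor and import path assumed as above.

package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	cache, _ := lru.New(128) // assumed constructor, as above

	ok, evicted := cache.ContainsOrAdd("a", 1)
	fmt.Println(ok, evicted) // false false: "a" was absent and has now been added

	ok, evicted = cache.ContainsOrAdd("a", 2)
	fmt.Println(ok, evicted) // true false: "a" was already present; the stored value stays 1
}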
func (*Cache) Get ¶
func (c *Cache) Get(key interface{}) (value interface{}, ok bool)
Get looks up a key's value from the cache.
func (*Cache) Keys ¶
func (c *Cache) Keys() []interface{}
Keys returns a slice of the keys in the cache, from oldest to newest.
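A short sketch of the ordering, with the constructor and import path assumed as above; Get moves a key to the newest position.

package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	cache, _ := lru.New(128) // assumed constructor, as above

	cache.Add("a", 1)
	cache.Add("b", 2)
	cache.Add("c", 3)
	cache.Get("a") // touching "a" makes it the newest entry

	fmt.Println(cache.Keys()) // [b c a], ordered oldest to newest
}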
func (*Cache) Peek ¶
func (c *Cache) Peek(key interface{}) (value interface{}, ok bool)
Peek returns the key's value (or nil if not found) without updating the "recently used"-ness of the key.
type TwoQueueCache ¶
type TwoQueueCache struct {
// contains filtered or unexported fields
}
TwoQueueCache is a thread-safe fixed size 2Q cache. 2Q is an enhancement over the standard LRU cache in that it tracks both frequently and recently used entries separately. This avoids a burst in access to new entries from evicting frequently used entries. It adds some additional tracking overhead to the standard LRU cache, is computationally about 2x the cost, and adds some metadata overhead. The ARCCache is similar, but does not require setting any parameters.
func New2Q ¶
func New2Q(size int) (*TwoQueueCache, error)
New2Q creates a new TwoQueueCache using the default values for the parameters.
func New2QParams ¶
func New2QParams(size int, recentRatio float64, ghostRatio float64) (*TwoQueueCache, error)
New2QParams creates a new TwoQueueCache using the provided parameter values.
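The constants above plug directly into New2QParams; passing the defaults is equivalent to New2Q. A minimal sketch, with the import path assumed as above.

package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru"
)

func main() {
	// Equivalent to New2Q(128): 25% of the space is dedicated to entries that
	// have only been accessed once, and ghost entries track recent evictions
	// at 50% of the cache size.
	cache, err := lru.New2QParams(128, lru.Default2QRecentRatio, lru.Default2QGhostEntries)
	if err != nil {
		panic(err)
	}

	cache.Add("a", 1)
	if v, ok := cache.Get("a"); ok { // a hit on a recently added entry promotes it to the frequent queue
		fmt.Println(v)
	}
}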
func (*TwoQueueCache) Add ¶
func (c *TwoQueueCache) Add(key, value interface{})
Add adds a value to the cache.
func (*TwoQueueCache) Contains ¶
func (c *TwoQueueCache) Contains(key interface{}) bool
Contains is used to check if the cache contains a key without updating recency or frequency.
func (*TwoQueueCache) Get ¶
func (c *TwoQueueCache) Get(key interface{}) (value interface{}, ok bool)
Get looks up a key's value from the cache.
func (*TwoQueueCache) Keys ¶
func (c *TwoQueueCache) Keys() []interface{}
Keys returns a slice of the keys in the cache. The frequently used keys are first in the returned slice.
func (*TwoQueueCache) Len ¶
func (c *TwoQueueCache) Len() int
Len returns the number of items in the cache.
func (*TwoQueueCache) Peek ¶
func (c *TwoQueueCache) Peek(key interface{}) (value interface{}, ok bool)
Peek is used to inspect the cache value of a key without updating recency or frequency.
func (*TwoQueueCache) Purge ¶
func (c *TwoQueueCache) Purge()
Purge is used to completely clear the cache.
func (*TwoQueueCache) Remove ¶
func (c *TwoQueueCache) Remove(key interface{})
Remove removes the provided key from the cache.