Documentation ¶
Overview ¶
Package lru provides three different LRU caches of varying sophistication.
Cache is a simple LRU cache. It is based on the LRU implementation in groupcache: https://github.com/golang/groupcache/tree/master/lru
TwoQueueCache tracks frequently used and recently used entries separately. This avoids a burst of accesses from taking out frequently used entries, at the cost of about 2x computational overhead and some extra bookkeeping.
ARCCache is an adaptive replacement cache. It tracks recent evictions as well as recent usage in both the frequent and recent caches. Its computational overhead is comparable to TwoQueueCache, but the memory overhead is linear with the size of the cache.
ARC has been patented by IBM, so do not use it if that is problematic for your program.
All caches in this package take locks while operating, and are therefore thread-safe for consumers.
Index ¶
- Constants
- type ARCCache
- func (c *ARCCache[K, V]) Add(key K, value V)
- func (c *ARCCache[K, V]) Contains(key K) bool
- func (c *ARCCache[K, V]) Get(key K) (value V, ok bool)
- func (c *ARCCache[K, V]) Keys() []K
- func (c *ARCCache[K, V]) Len() int
- func (c *ARCCache[K, V]) Peek(key K) (value V, ok bool)
- func (c *ARCCache[K, V]) Purge()
- func (c *ARCCache[K, V]) Remove(key K)
- type Cache
- func (c *Cache[K, V]) Add(key K, value V) (evicted bool)
- func (c *Cache[K, V]) Contains(key K) bool
- func (c *Cache[K, V]) ContainsOrAdd(key K, value V) (ok, evicted bool)
- func (c *Cache[K, V]) Get(key K) (value V, ok bool)
- func (c *Cache[K, V]) GetOldest() (key K, value V, ok bool)
- func (c *Cache[K, V]) Keys() []K
- func (c *Cache[K, V]) Len() int
- func (c *Cache[K, V]) Peek(key K) (value V, ok bool)
- func (c *Cache[K, V]) PeekOrAdd(key K, value V) (previous V, ok, evicted bool)
- func (c *Cache[K, V]) Purge()
- func (c *Cache[K, V]) Remove(key K) (present bool)
- func (c *Cache[K, V]) RemoveOldest() (key K, value V, ok bool)
- func (c *Cache[K, V]) Resize(size int) (evicted int)
- type TwoQueueCache
- func (c *TwoQueueCache[K, V]) Add(key K, value V)
- func (c *TwoQueueCache[K, V]) Contains(key K) bool
- func (c *TwoQueueCache[K, V]) Get(key K) (value V, ok bool)
- func (c *TwoQueueCache[K, V]) Keys() []K
- func (c *TwoQueueCache[K, V]) Len() int
- func (c *TwoQueueCache[K, V]) Peek(key K) (value V, ok bool)
- func (c *TwoQueueCache[K, V]) Purge()
- func (c *TwoQueueCache[K, V]) Remove(key K)
Constants ¶
const (
	// Default2QRecentRatio is the ratio of the 2Q cache dedicated
	// to recently added entries that have only been accessed once.
	Default2QRecentRatio = 0.25

	// Default2QGhostEntries is the default ratio of ghost
	// entries kept to track entries recently evicted.
	Default2QGhostEntries = 0.50
)
const (
// DefaultEvictedBufferSize defines the default buffer size to store evicted key/val
DefaultEvictedBufferSize = 16
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type ARCCache ¶
type ARCCache[K comparable, V any] struct {
	// contains filtered or unexported fields
}
ARCCache is a thread-safe fixed size Adaptive Replacement Cache (ARC). ARC is an enhancement over the standard LRU cache in that it tracks both frequency and recency of use. This avoids a burst of accesses to new entries from evicting frequently used older entries. It adds some tracking overhead to a standard LRU cache: computationally it is roughly 2x the cost, and the extra memory overhead is linear with the size of the cache. ARC has been patented by IBM, but is similar to the TwoQueueCache (2Q), which requires setting parameters.
func NewARC ¶
func NewARC[K comparable, V any](size int) (*ARCCache[K, V], error)
NewARC creates an ARC of the given size.
func (*ARCCache[K, V]) Add ¶
func (c *ARCCache[K, V]) Add(key K, value V)
Add adds a value to the cache.
func (*ARCCache[K, V]) Contains ¶
func (c *ARCCache[K, V]) Contains(key K) bool
Contains is used to check if the cache contains a key without updating recency or frequency.
type Cache ¶
type Cache[K comparable, V any] struct {
	// contains filtered or unexported fields
}
Cache is a thread-safe fixed size LRU cache.
func New ¶
func New[K comparable, V any](size int) (*Cache[K, V], error)
New creates an LRU of the given size.
func NewWithEvict ¶
func NewWithEvict[K comparable, V any](size int, onEvicted func(key K, value V)) (c *Cache[K, V], err error)
NewWithEvict constructs a fixed size cache with the given eviction callback.
func (*Cache[K, V]) Contains ¶
func (c *Cache[K, V]) Contains(key K) bool
Contains checks if a key is in the cache, without updating the recent-ness or deleting it for being stale.
func (*Cache[K, V]) ContainsOrAdd ¶
func (c *Cache[K, V]) ContainsOrAdd(key K, value V) (ok, evicted bool)
ContainsOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether the key was found and whether an eviction occurred.
func (*Cache[K, V]) Keys ¶
func (c *Cache[K, V]) Keys() []K
Keys returns a slice of the keys in the cache, from oldest to newest.
func (*Cache[K, V]) Peek ¶
func (c *Cache[K, V]) Peek(key K) (value V, ok bool)
Peek returns the value for a key (or the zero value if not found) without updating the "recently used"-ness of the key.
func (*Cache[K, V]) PeekOrAdd ¶
func (c *Cache[K, V]) PeekOrAdd(key K, value V) (previous V, ok, evicted bool)
PeekOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns the previous value if found, whether it was found, and whether an eviction occurred.
func (*Cache[K, V]) Purge ¶
func (c *Cache[K, V]) Purge()
Purge is used to completely clear the cache.
func (*Cache[K, V]) RemoveOldest ¶
func (c *Cache[K, V]) RemoveOldest() (key K, value V, ok bool)
RemoveOldest removes the oldest item from the cache.
type TwoQueueCache ¶
type TwoQueueCache[K comparable, V any] struct {
	// contains filtered or unexported fields
}
TwoQueueCache is a thread-safe fixed size 2Q cache. 2Q is an enhancement over the standard LRU cache in that it tracks frequently and recently used entries separately. This avoids a burst of accesses to new entries from evicting frequently used entries. It adds some additional tracking overhead to the standard LRU cache, computationally about 2x the cost, plus some extra metadata overhead. The ARCCache is similar, but does not require setting any parameters.
func New2Q ¶
func New2Q[K comparable, V any](size int) (*TwoQueueCache[K, V], error)
New2Q creates a new TwoQueueCache using the default values for the parameters.
func New2QParams ¶
func New2QParams[K comparable, V any](size int, recentRatio, ghostRatio float64) (*TwoQueueCache[K, V], error)
New2QParams creates a new TwoQueueCache using the provided parameter values.
func (*TwoQueueCache[K, V]) Add ¶
func (c *TwoQueueCache[K, V]) Add(key K, value V)
Add adds a value to the cache.
func (*TwoQueueCache[K, V]) Contains ¶
func (c *TwoQueueCache[K, V]) Contains(key K) bool
Contains is used to check if the cache contains a key without updating recency or frequency.
func (*TwoQueueCache[K, V]) Get ¶
func (c *TwoQueueCache[K, V]) Get(key K) (value V, ok bool)
Get looks up a key's value from the cache.
func (*TwoQueueCache[K, V]) Keys ¶
func (c *TwoQueueCache[K, V]) Keys() []K
Keys returns a slice of the keys in the cache. The frequently used keys are first in the returned slice.
func (*TwoQueueCache[K, V]) Len ¶
func (c *TwoQueueCache[K, V]) Len() int
Len returns the number of items in the cache.
func (*TwoQueueCache[K, V]) Peek ¶
func (c *TwoQueueCache[K, V]) Peek(key K) (value V, ok bool)
Peek is used to inspect the cache value of a key without updating recency or frequency.
func (*TwoQueueCache[K, V]) Purge ¶
func (c *TwoQueueCache[K, V]) Purge()
Purge is used to completely clear the cache.
func (*TwoQueueCache[K, V]) Remove ¶
func (c *TwoQueueCache[K, V]) Remove(key K)
Remove removes the provided key from the cache.