package cache

v0.0.0-...-8f1101b
Published: Nov 26, 2024 License: Apache-2.0 Imports: 4 Imported by: 2

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Callbacks

type Callbacks struct {
	LookupsCallback         EventCallback
	HitsCallback            EventCallback
	MissesCallback          EventCallback
	ForcedEvictionsCallback simplelru.EvictCallback
	ManualEvictionsCallback EventCallback
}

Callbacks stores various callbacks that may fire during the lifetime of an LRUCache.

NOTE: Your callbacks must return quickly. The cache invokes them synchronously, so a slow callback degrades cache performance on every lookup. We invoke callbacks synchronously (instead of spawning a goroutine for them ourselves) to leave that choice to the user: hard-coding a `go ...` invocation at our callback call sites would add unnecessary overhead when the callbacks are already optimized to return quickly.
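
For example, a caller who needs a slow side effect can spawn the goroutine inside their own callback and keep the synchronous path cheap. A minimal sketch (hitCounter and auditLog are hypothetical, and the package is assumed to be imported as "cache" since its import path is not shown on this page):

callbacks := cache.Callbacks{
	// Cheap work is fine to run synchronously on the lookup path.
	HitsCallback: func(key interface{}) {
		hitCounter.Inc()
	},
	// Slow work: spawn the goroutine ourselves instead of blocking the cache.
	MissesCallback: func(key interface{}) {
		go auditLog(key)
	},
}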

type EventCallback

type EventCallback func(key interface{})

EventCallback is similar to simplelru.EvictCallback, except that it doesn't take a value argument.

type LRUCache

type LRUCache struct {
	*sync.Mutex
	*simplelru.LRU
	// contains filtered or unexported fields
}

LRUCache is the actual concurrent non-blocking cache.

func NewLRUCache

func NewLRUCache(size int,
	callbacks Callbacks) (*LRUCache, error)

NewLRUCache returns a new LRUCache with the given size (maximum number of elements). The ForcedEvictionsCallback in callbacks is called when an eviction occurs in the underlying cache.
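
A sketch of constructing the cache (the package is assumed to be imported as "cache"; the two-argument eviction callback follows simplelru.EvictCallback as referenced in the Callbacks struct above; the "log" import is omitted):

lru, err := cache.NewLRUCache(1000, cache.Callbacks{
	ForcedEvictionsCallback: func(key interface{}, value interface{}) {
		log.Printf("forced eviction of %v to make room", key)
	},
})
if err != nil {
	log.Fatal(err) // e.g. an invalid size (an assumption about the failure mode)
}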

func (*LRUCache) GetOrAdd

func (lruCache *LRUCache) GetOrAdd(
	key interface{},
	valConstructor ValConstructor) (interface{}, bool, error)

GetOrAdd tries to use the cache, if it is available, to get a Value. Value is assumed to be expensive to construct from scratch, which is why we consult the cache in the first place. If we do end up constructing a Value from scratch, we store it in the cache under the corresponding key, so that future lookups need only the key.

This cache is resistant to cache stampedes because it uses a duplicate suppression strategy, also known as request coalescing.
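
A sketch of a lookup that memoizes an expensive construction (fetchRemote, repoKey, and RemoteData are hypothetical; the meaning of the bool return is not spelled out above, so it is ignored here):

val, _, err := lru.GetOrAdd(repoKey, func() (interface{}, error) {
	// Runs only on a cache miss; concurrent callers asking for the same
	// key are coalesced onto this one construction.
	return fetchRemote(repoKey)
})
if err != nil {
	return err
}
data := val.(*RemoteData) // the caller asserts back to the concrete type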

type Promise

type Promise struct {
	// contains filtered or unexported fields
}

Promise is a wrapper around cache value construction; it synchronizes the to-be-cached value between the first goroutine that suffers a cache miss and subsequent goroutines that look up the same cache entry. When the Promise is resolved (that is, when the "valConstructionPending" channel is closed), the value is ready for concurrent reads.
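
The underlying mechanism, sketched in miniature (this is not the package's own code, just an illustration of the closed-channel synchronization described above):

type promise struct {
	pending chan struct{} // closed once construction finishes
	val     interface{}
	err     error
}

// resolve publishes the constructed value; the writes happen before the
// close, so concurrent readers that return from wait see them safely.
func (p *promise) resolve(v interface{}, err error) {
	p.val, p.err = v, err
	close(p.pending)
}

// wait blocks until the promise is resolved, then reads the value.
func (p *promise) wait() (interface{}, error) {
	<-p.pending
	return p.val, p.err
}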

type ValConstructor

type ValConstructor func() (interface{}, error)

ValConstructor is used to construct a value. The assumption is that this ValConstructor is expensive to compute, and that we need to memoize it via the LRUCache. The raw values of a cache are only constructed after a cache miss (and only the first cache miss). Using this type allows us to use any arbitrary function whose resulting value needs to be memoized (saved in the cache). This type also allows us to delay running the expensive computation until we actually need it (after a cache miss).
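
Because a ValConstructor takes no arguments, parameters are typically carried in via a closure; a hypothetical sketch (expensiveLookup is not part of this package):

// The closure captures org and repo, so the expensive work is deferred
// until the cache actually misses on the corresponding key.
func makeConstructor(org, repo string) cache.ValConstructor {
	return func() (interface{}, error) {
		return expensiveLookup(org, repo)
	}
}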
