reqcache

Published: Jul 26, 2019 License: BSD-3-Clause Imports: 3 Imported by: 10

README

LRU Cache with Request Management

This implements an LRU cache for an I/O resource. On a cache miss, the cache requests the resource, blocking until it is available. The cache has the property that if multiple requests are concurrently made for a resource that is not in the cache, then only one request for the resource is made, and all requests block until it is available.

Because this library is designed for caching resources that require I/O to fetch, it is fully concurrent. Fetches are made without holding any locks, meaning that:

  1. While one operation is blocking due to a cache miss, additional operations on the cache will not block.
  2. Multiple fetches for resources can be concurrently pending.

This library is also context-aware.

License

This library is licensed under the BSD 3-Clause License, but can also be licensed under the GNU GPL v3 License and freely used in projects licensed as GNU GPL v3.

Documentation

Overview

Package reqcache provides an LRU cache with request management. See the GitHub page for more information.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CacheStatus

type CacheStatus int

CacheStatus describes the presence of an item in the cache.

const (
	// Present status indicates that the specified item is in the cache.
	Present CacheStatus = iota

	// Pending status indicates that the specified item is not in the cache,
	// but that there is a pending request for the item.
	Pending CacheStatus = iota

	// Missing status indicates that the specified item is not in the cache,
	// and that there is no outstanding request for the item.
	Missing CacheStatus = iota
)

type LRUCache

type LRUCache struct {
	// contains filtered or unexported fields
}

LRUCache represents a cache with the LRU eviction policy.

func NewLRUCache

func NewLRUCache(capacity uint64, fetch func(ctx context.Context, key interface{}) (interface{}, uint64, error), onEvict func(evicted []*LRUCacheEntry)) *LRUCache

NewLRUCache returns a new instance of LRUCache.

capacity is the capacity of the cache. If the sum of the sizes of elements in the cache exceeds the capacity, the least recently used elements are evicted from the cache.

fetch is a function that is called on cache misses to fetch the element that is missing from the cache. The key that missed in the cache is passed as the argument. The function should return the corresponding value and the size of the result (used to make sure that the total size does not exceed the cache's capacity). It can also return an error, in which case the result is not cached and the error is propagated to callers of Get(). No locks are held when fetch is called, so it is suitable to do blocking operations to fetch data. A context is also passed in; it is completely unrelated to the context passed to Get(). This context may be cancelled if no pending calls to Get() are interested in the result, which may happen if the contexts of all requesting goroutines time out, or if Put() is called while the request is being fetched.

onEvict is a function that is called whenever elements are evicted from the cache according to the LRU replacement policy. It is called with the key-value pairs representing the evicted elements passed as arguments. It is not called with locks held, so it can perform blocking operations or even interact with this cache. It can be set to nil if the onEvict callback is not needed.

func (*LRUCache) Evict

func (lruc *LRUCache) Evict(key interface{}) bool

Evict an entry from the cache.

func (*LRUCache) Get

func (lruc *LRUCache) Get(ctx context.Context, key interface{}) (interface{}, error)

Get returns the value corresponding to the specified key, caching the result. It returns an error if and only if there was a cache miss and the provided fetch() function returned an error. If Put() is called while a fetch is blocking, then the result of the fetch is thrown away and the value specified by Put() is returned.

func (*LRUCache) Invalidate

func (lruc *LRUCache) Invalidate()

Invalidate empties the cache, calling the onEvict callback as appropriate.

func (*LRUCache) Put

func (lruc *LRUCache) Put(key interface{}, value interface{}, size uint64) bool

Put inserts an entry with a known value into the cache.

func (*LRUCache) SetCapacity

func (lruc *LRUCache) SetCapacity(capacity uint64)

SetCapacity sets the capacity of the cache, evicting elements if necessary.

func (*LRUCache) TryGet

func (lruc *LRUCache) TryGet(key interface{}) (interface{}, CacheStatus)

TryGet checks if an element is present in the cache. The second return value describes the presence of the item in the cache; if it is Present, the first return value holds the value corresponding to the provided key.

type LRUCacheEntry

type LRUCacheEntry struct {
	// Key is the key of the key-value pair represented by this entry.
	Key interface{}

	// Value is the value of the key-value pair represented by this entry.
	Value interface{}
	// contains filtered or unexported fields
}

LRUCacheEntry represents an entry in the LRU Cache. The size of this struct is the overhead of an entry existing in the cache. You should not have to actually use it.
