Documentation ¶
Overview ¶
Package caching implements common server object caches.
Two caches are defined:
- A process-global LRU cache, which may retain data in between requests.
- A per-request cache, which can be installed into the Context that is servicing an individual request, and will be purged at the end of that request.
Index ¶
- Variables
- func RequestCache(c context.Context) *lru.Cache[string, any]
- func WithEmptyProcessCache(ctx context.Context) context.Context
- func WithGlobalCache(c context.Context, provider BlobCacheProvider) context.Context
- func WithProcessCacheData(ctx context.Context, data *ProcessCacheData) context.Context
- func WithRequestCache(c context.Context) context.Context
- type BlobCache
- type BlobCacheProvider
- type FetchCallback
- type LRUHandle
- type ProcessCacheData
- type SlotHandle
Constants ¶
This section is empty.
Variables ¶
var ErrCacheMiss = errors.New("cache miss")
ErrCacheMiss is returned by BlobCache.Get if the requested item is missing.
var (
    // ErrNoProcessCache is returned by Fetch if the context doesn't have
    // ProcessCacheData.
    //
    // This usually happens in tests. Use WithEmptyProcessCache to prepare the
    // context.
    ErrNoProcessCache = errors.New("no process cache is installed in the context, use WithEmptyProcessCache")
)
Functions ¶
func RequestCache ¶
func RequestCache(c context.Context) *lru.Cache[string, any]
RequestCache retrieves the per-request in-memory cache from the Context. If no request cache is installed, this will panic.
func WithEmptyProcessCache ¶
func WithEmptyProcessCache(ctx context.Context) context.Context
WithEmptyProcessCache installs an empty process-global cache storage into the context.
Useful in main() when initializing a root context (used as a basis for all other contexts) or in unit tests to "reset" the cache state.
Note that using WithEmptyProcessCache when initializing a per-request context makes no sense, since each request would get its own empty cache. Instead, allocate the cache storage via NewProcessCacheData(), retain it in some global variable, and install it into per-request contexts via WithProcessCacheData.
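For illustration, a minimal sketch of this wiring, assuming the package is imported as go.chromium.org/luci/server/caching (the HTTP handler below is illustrative and not part of this package):

package main

import (
    "net/http"

    "go.chromium.org/luci/server/caching"
)

// Shared by all requests. Allocated in main(), after init() time, so that all
// RegisterLRUCache/RegisterCacheSlot calls have already happened.
var processCache *caching.ProcessCacheData

func main() {
    processCache = caching.NewProcessCacheData()

    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        // Install the shared storage into each per-request context, so all
        // requests see the same process-wide cache.
        ctx := caching.WithProcessCacheData(r.Context(), processCache)
        _ = ctx // pass ctx to code that uses registered LRU handles or slots
    })
    _ = http.ListenAndServe(":8080", nil)
}

In unit tests, caching.WithEmptyProcessCache(context.Background()) gives each test its own isolated cache state instead.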
func WithGlobalCache ¶
func WithGlobalCache(c context.Context, provider BlobCacheProvider) context.Context
WithGlobalCache installs a global cache implementation into the supplied context.
func WithProcessCacheData ¶
func WithProcessCacheData(ctx context.Context, data *ProcessCacheData) context.Context
WithProcessCacheData installs an existing process-global cache storage into the supplied context.
It must be allocated via NewProcessCacheData().
func WithRequestCache ¶
func WithRequestCache(c context.Context) context.Context
WithRequestCache initializes a context-bound local cache and adds it to the Context.
The cache has unbounded size. This is fine, since the lifetime of the cache is still scoped to a single request.
It can be used as a second fast layer of caching in front of memcache. It is never trimmed, only released all at once when the request completes.
TODO(vadimsh): Get rid of it, there's only one caller of RequestCache(...).
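A hedged sketch of installing and retrieving the request cache, assuming the go.chromium.org/luci/server/caching import path; the middleware shape below is illustrative:

package example

import (
    "net/http"

    "go.chromium.org/luci/server/caching"
)

// withRequestCache is an illustrative middleware that gives every request its
// own unbounded, request-scoped cache.
func withRequestCache(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        ctx := caching.WithRequestCache(r.Context())
        next.ServeHTTP(w, r.WithContext(ctx))
    })
}

func handler(w http.ResponseWriter, r *http.Request) {
    // Panics if withRequestCache did not run upstream.
    cache := caching.RequestCache(r.Context())
    _ = cache // a *lru.Cache[string, any] scoped to this request
}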
Types ¶
type BlobCache ¶
type BlobCache interface {
    // Get returns a cached item or ErrCacheMiss if it's not in the cache.
    Get(c context.Context, key string) ([]byte, error)

    // Set unconditionally overwrites an item in the cache.
    //
    // If 'exp' is zero, the item will have no expiration time.
    Set(c context.Context, key string, value []byte, exp time.Duration) error
}
BlobCache is a minimal interface for a cross-process memcache-like cache.
It exists to support low-level server libraries that target all sorts of environments that may have different memcache implementations (or none at all).
This interface provides the smallest sufficient API to support the needs of low-level libraries. If you need anything fancier, consider using the concrete memcache implementation directly (e.g. use luci/gae's memcache library when running on GAE Standard).
BlobCache has the following properties:
- All service processes share this cache (thus 'global').
- The global cache is namespaced (see the GlobalCache function below).
- It is strongly consistent.
- Items can be evicted at random times.
- Key size is <250 bytes.
- Item size is <1 MB.
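To illustrate the contract, here is a minimal in-process implementation sketch of this interface. It is purely illustrative (real tests would normally use the cachingtest subpackage listed under Directories), and it assumes the go.chromium.org/luci/server/caching import path:

package example

import (
    "context"
    "sync"
    "time"

    "go.chromium.org/luci/server/caching"
)

type entry struct {
    blob []byte
    exp  time.Time // zero means "no expiration"
}

// memBlobCache keeps blobs in a map guarded by a mutex. Unlike a real global
// cache, it is not shared across processes and never evicts at random.
type memBlobCache struct {
    m       sync.Mutex
    entries map[string]entry
}

func (c *memBlobCache) Get(ctx context.Context, key string) ([]byte, error) {
    c.m.Lock()
    defer c.m.Unlock()
    e, ok := c.entries[key]
    if !ok || (!e.exp.IsZero() && time.Now().After(e.exp)) {
        return nil, caching.ErrCacheMiss
    }
    return e.blob, nil
}

func (c *memBlobCache) Set(ctx context.Context, key string, value []byte, exp time.Duration) error {
    c.m.Lock()
    defer c.m.Unlock()
    if c.entries == nil {
        c.entries = map[string]entry{}
    }
    e := entry{blob: value}
    if exp > 0 {
        e.exp = time.Now().Add(exp)
    }
    c.entries[key] = e
    return nil
}

var _ caching.BlobCache = (*memBlobCache)(nil) // compile-time interface check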
func GlobalCache ¶
GlobalCache returns a global cache targeting the given namespace or nil if the global cache is not available in the current environment.
Libraries that can live without a global cache must handle its absence gracefully (e.g. by falling back to using only the process cache).
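A sketch of such graceful handling, assuming GlobalCache takes a context and a namespace string as described above (the "blobs" namespace and the fetchFromSource helper are illustrative):

package example

import (
    "context"
    "errors"
    "time"

    "go.chromium.org/luci/server/caching"
)

// fetchFromSource stands in for whatever expensive lookup is being cached.
func fetchFromSource(ctx context.Context, key string) ([]byte, error) {
    return []byte("value for " + key), nil
}

func cachedBlob(ctx context.Context, key string) ([]byte, error) {
    gc := caching.GlobalCache(ctx, "blobs")
    if gc == nil {
        // No cross-process cache in this environment: degrade gracefully.
        return fetchFromSource(ctx, key)
    }
    blob, err := gc.Get(ctx, key)
    switch {
    case err == nil:
        return blob, nil
    case errors.Is(err, caching.ErrCacheMiss):
        if blob, err = fetchFromSource(ctx, key); err != nil {
            return nil, err
        }
        // Best effort: the cache is optional, so ignore Set failures.
        _ = gc.Set(ctx, key, blob, 10*time.Minute)
        return blob, nil
    default:
        return nil, err
    }
}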
type BlobCacheProvider ¶
BlobCacheProvider returns a BlobCache instance targeting a namespace.
type FetchCallback ¶
FetchCallback knows how to grab a new value for the cache slot (if prev is nil) or refresh the known one (if prev is not nil).
If the returned expiration time is 0, the value is considered non-expirable. If the returned expiration time is <0, the value will be refetched on the next access. This is sometimes useful in tests that "freeze" time.
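As a hedged sketch, assuming the callback has the shape func(prev any) (updated any, exp time.Duration, err error) implied by the description and by SlotHandle.Fetch below (the loadValue helper is illustrative):

package example

import "time"

// loadValue stands in for the expensive computation being cached.
func loadValue() (string, error) { return "fresh value", nil }

// refresh is intended to be passed to SlotHandle.Fetch as a FetchCallback.
func refresh(prev any) (any, time.Duration, error) {
    // prev is nil on the first fetch and non-nil when refreshing a stale
    // value; a cheaper revalidation path could branch on that here.
    v, err := loadValue()
    if err != nil {
        return nil, 0, err
    }
    return v, 10 * time.Minute, nil // 0 means "never expires", <0 means "refetch on next access"
}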
type LRUHandle ¶
type LRUHandle[K comparable, V any] struct {
    // contains filtered or unexported fields
}
LRUHandle is an indirect pointer to a registered LRU process cache.
Grab it via RegisterLRUCache during module init time, and use its LRU() method to access the actual LRU cache associated with this handle.
The cache itself lives inside a context. See WithProcessCacheData.
func RegisterLRUCache ¶
func RegisterLRUCache[K comparable, V any](capacity int) LRUHandle[K, V]
RegisterLRUCache is used during init time to declare an intent that a package wants to use a process-global LRU cache of the given capacity (or 0 for unlimited).
The actual cache itself will be stored in ProcessCacheData inside a context.
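A sketch of the registration pattern, assuming the go.chromium.org/luci/server/caching import path and that LRUHandle's LRU method takes the context carrying ProcessCacheData (as described above); the user type is illustrative:

package example

import (
    "context"

    "go.chromium.org/luci/server/caching"
)

type user struct{ Name string }

// Declared at init() time; the actual storage lives in ProcessCacheData inside
// the context, so this handle is just a stable key into it.
var usersCache = caching.RegisterLRUCache[string, *user](1024)

func withUsersCache(ctx context.Context) {
    // Assumption: LRU(ctx) returns the *lru.Cache[string, *user] held by the
    // ProcessCacheData installed into ctx (see WithProcessCacheData).
    cache := usersCache.LRU(ctx)
    _ = cache // use it for bounded, process-wide caching of user objects
}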
type ProcessCacheData ¶
type ProcessCacheData struct {
// contains filtered or unexported fields
}
ProcessCacheData holds all process-cached data (internally).
It is opaque to the API users. Use NewProcessCacheData in your main() or below (i.e. anywhere other than init()) to allocate it, then inject it into the context via WithProcessCacheData, and finally access it through handles registered at init() time via RegisterLRUCache to get a reference to an actual lru.Cache.
Each instance of ProcessCacheData is its own universe of global data. This is useful in unit tests as a replacement for global variables.
func NewProcessCacheData ¶
func NewProcessCacheData() *ProcessCacheData
NewProcessCacheData allocates and initializes all registered LRU caches.
It returns a fat stateful object that holds all the cached data. Retain it and share it between requests to actually benefit from the cache.
NewProcessCacheData must be called after init() time (either in main or code called from main).
type SlotHandle ¶
type SlotHandle struct {
// contains filtered or unexported fields
}
SlotHandle is an indirect pointer to a registered process cache slot.
Such a slot holds one arbitrary value, alongside its expiration time. Useful for representing global singletons that occasionally need to be refreshed.
Grab it via RegisterCacheSlot during module init time, and use its Fetch() method to access the value, refreshing it if necessary.
The value itself lives inside a context. See WithProcessCacheData.
func RegisterCacheSlot ¶
func RegisterCacheSlot() SlotHandle
RegisterCacheSlot is used during init time to preallocate a place for the cache global variable.
The actual cache itself will be stored in ProcessCacheData inside a context.
func (SlotHandle) Fetch ¶
func (h SlotHandle) Fetch(ctx context.Context, cb FetchCallback) (any, error)
Fetch returns the cached data, if it is available and fresh, or attempts to refresh it by calling the given callback.
Returns ErrNoProcessCache if the context doesn't have ProcessCacheData.
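A sketch of the whole slot pattern, assuming the go.chromium.org/luci/server/caching import path and the FetchCallback shape discussed above (the config type and loadConfig helper are illustrative):

package example

import (
    "context"
    "time"

    "go.chromium.org/luci/server/caching"
)

type config struct{ Greeting string }

// loadConfig stands in for the expensive fetch being cached.
func loadConfig(ctx context.Context) (*config, error) {
    return &config{Greeting: "hello"}, nil
}

// Registered at init() time; the value itself lives in ProcessCacheData.
var configSlot = caching.RegisterCacheSlot()

func currentConfig(ctx context.Context) (*config, error) {
    val, err := configSlot.Fetch(ctx, func(prev any) (any, time.Duration, error) {
        cfg, err := loadConfig(ctx)
        if err != nil {
            return nil, 0, err
        }
        return cfg, 5 * time.Minute, nil // refresh at most every 5 minutes
    })
    if err != nil {
        return nil, err // e.g. ErrNoProcessCache if the context isn't prepared
    }
    return val.(*config), nil
}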
Directories ¶
Path | Synopsis
---|---
cachingtest | Package cachingtest contains helpers for testing code that uses caching package.
layered | Package layered provides a two-layer cache for serializable objects.