loaders

package
v0.17.0
Published: Oct 18, 2022 License: AGPL-3.0 Imports: 8 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type FileLoader

type FileLoader struct {
	// contains filtered or unexported fields
}

FileLoader batches and caches requests

func NewFileLoader

func NewFileLoader(config FileLoaderConfig) *FileLoader

NewFileLoader creates a new FileLoader given a fetch, wait, and maxBatch
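
As a sketch, constructing a FileLoader might look like the following. The fetchFiles helper and the import paths are assumptions based on the stashapp/stash module layout; only NewFileLoader and the FileLoaderConfig fields (Fetch, Wait, MaxBatch) come from this page.

package main

import (
	"time"

	// Import paths are assumed; adjust to the actual module layout.
	"github.com/stashapp/stash/internal/api/loaders"
	"github.com/stashapp/stash/pkg/file"
)

// fetchFiles is a hypothetical helper that resolves all keys in one query.
// Results and errors are expected to align positionally with the input keys.
func fetchFiles(ids []file.ID) ([]file.File, []error) {
	files := make([]file.File, len(ids))
	errs := make([]error, len(ids))
	// ... look up each id, filling files[i] or errs[i] ...
	return files, errs
}

func main() {
	l := loaders.NewFileLoader(loaders.FileLoaderConfig{
		Fetch:    fetchFiles,
		Wait:     2 * time.Millisecond, // collect keys for up to 2ms before fetching
		MaxBatch: 100,                  // at most 100 keys per batch; 0 means no limit
	})

	// Concurrent Load calls made within the Wait window are coalesced into a
	// single Fetch call.
	f, err := l.Load(file.ID(1))
	_ = f
	_ = err
}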

func (*FileLoader) Clear

func (l *FileLoader) Clear(key file.ID)

Clear the value at key from the cache, if it exists

func (*FileLoader) Load

func (l *FileLoader) Load(key file.ID) (file.File, error)

Load a File by key; batching and caching will be applied automatically

func (*FileLoader) LoadAll

func (l *FileLoader) LoadAll(keys []file.ID) ([]file.File, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured
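
For example (a sketch; l is an assumed *FileLoader, such as the one constructed above, and ids is an assumed []file.ID):

files, errs := l.LoadAll(ids)
for i := range files {
	if errs[i] != nil {
		// errors are returned per key, aligned with the input slice
		continue
	}
	_ = files[i] // use files[i]
}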

func (*FileLoader) LoadAllThunk

func (l *FileLoader) LoadAllThunk(keys []file.ID) func() ([]file.File, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Files. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*FileLoader) LoadThunk

func (l *FileLoader) LoadThunk(key file.ID) func() (file.File, error)

LoadThunk returns a function that, when called, will block waiting for a File. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.
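
A sketch of the thunk pattern, assuming l is a *FileLoader and ids is a []file.ID: issue all of the thunks first so the keys can be batched together, then resolve them.

thunks := make([]func() (file.File, error), 0, len(ids))
for _, id := range ids {
	// LoadThunk does not block; it only queues the key for the next batch.
	thunks = append(thunks, l.LoadThunk(id))
}

for _, thunk := range thunks {
	// Each call blocks until the batch containing its key has been fetched.
	f, err := thunk()
	if err != nil {
		continue // handle the per-key error as appropriate
	}
	_ = f
}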

func (*FileLoader) Prime

func (l *FileLoader) Prime(key file.ID, value file.File) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)
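
For example, to overwrite a value that may already be cached (a sketch; key is an assumed file.ID and f the corresponding file.File):

l.Clear(key)    // drop any existing cached value for key
l.Prime(key, f) // Prime now succeeds and returns true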

type FileLoaderConfig

type FileLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []file.ID) ([]file.File, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

FileLoaderConfig captures the config to create a new FileLoader

type GalleryFileIDsLoader

type GalleryFileIDsLoader struct {
	// contains filtered or unexported fields
}

GalleryFileIDsLoader batches and caches requests

func NewGalleryFileIDsLoader

func NewGalleryFileIDsLoader(config GalleryFileIDsLoaderConfig) *GalleryFileIDsLoader

NewGalleryFileIDsLoader creates a new GalleryFileIDsLoader given a fetch, wait, and maxBatch

func (*GalleryFileIDsLoader) Clear

func (l *GalleryFileIDsLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*GalleryFileIDsLoader) Load

func (l *GalleryFileIDsLoader) Load(key int) ([]file.ID, error)

Load a slice of file IDs by key; batching and caching will be applied automatically

func (*GalleryFileIDsLoader) LoadAll

func (l *GalleryFileIDsLoader) LoadAll(keys []int) ([][]file.ID, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*GalleryFileIDsLoader) LoadAllThunk

func (l *GalleryFileIDsLoader) LoadAllThunk(keys []int) func() ([][]file.ID, []error)

LoadAllThunk returns a function that, when called, will block waiting for the slices of file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*GalleryFileIDsLoader) LoadThunk

func (l *GalleryFileIDsLoader) LoadThunk(key int) func() ([]file.ID, error)

LoadThunk returns a function that, when called, will block waiting for the file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*GalleryFileIDsLoader) Prime

func (l *GalleryFileIDsLoader) Prime(key int, value []file.ID) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type GalleryFileIDsLoaderConfig

type GalleryFileIDsLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([][]file.ID, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

GalleryFileIDsLoaderConfig captures the config to create a new GalleryFileIDsLoader
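
Unlike FileLoader, this loader maps each key to a slice of values, so Fetch must return one []file.ID per gallery key. A sketch, where galleryFileIDs is a hypothetical per-gallery lookup (a real Fetch would normally resolve all keys in a single query):

l := loaders.NewGalleryFileIDsLoader(loaders.GalleryFileIDsLoaderConfig{
	Fetch: func(galleryIDs []int) ([][]file.ID, []error) {
		results := make([][]file.ID, len(galleryIDs))
		errs := make([]error, len(galleryIDs))
		for i, id := range galleryIDs {
			results[i], errs[i] = galleryFileIDs(id)
		}
		return results, errs
	},
	Wait:     2 * time.Millisecond,
	MaxBatch: 100,
})

fileIDs, err := l.Load(galleryID) // galleryID is an assumed int
_ = fileIDs
_ = err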

type GalleryLoader

type GalleryLoader struct {
	// contains filtered or unexported fields
}

GalleryLoader batches and caches requests

func NewGalleryLoader

func NewGalleryLoader(config GalleryLoaderConfig) *GalleryLoader

NewGalleryLoader creates a new GalleryLoader given a fetch, wait, and maxBatch

func (*GalleryLoader) Clear

func (l *GalleryLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*GalleryLoader) Load

func (l *GalleryLoader) Load(key int) (*models.Gallery, error)

Load a Gallery by key; batching and caching will be applied automatically

func (*GalleryLoader) LoadAll

func (l *GalleryLoader) LoadAll(keys []int) ([]*models.Gallery, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*GalleryLoader) LoadAllThunk

func (l *GalleryLoader) LoadAllThunk(keys []int) func() ([]*models.Gallery, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Galleries. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*GalleryLoader) LoadThunk

func (l *GalleryLoader) LoadThunk(key int) func() (*models.Gallery, error)

LoadThunk returns a function that, when called, will block waiting for a Gallery. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*GalleryLoader) Prime

func (l *GalleryLoader) Prime(key int, value *models.Gallery) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type GalleryLoaderConfig

type GalleryLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Gallery, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

GalleryLoaderConfig captures the config to create a new GalleryLoader

type ImageFileIDsLoader

type ImageFileIDsLoader struct {
	// contains filtered or unexported fields
}

ImageFileIDsLoader batches and caches requests

func NewImageFileIDsLoader

func NewImageFileIDsLoader(config ImageFileIDsLoaderConfig) *ImageFileIDsLoader

NewImageFileIDsLoader creates a new ImageFileIDsLoader given a fetch, wait, and maxBatch

func (*ImageFileIDsLoader) Clear

func (l *ImageFileIDsLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*ImageFileIDsLoader) Load

func (l *ImageFileIDsLoader) Load(key int) ([]file.ID, error)

Load a slice of file IDs by key; batching and caching will be applied automatically

func (*ImageFileIDsLoader) LoadAll

func (l *ImageFileIDsLoader) LoadAll(keys []int) ([][]file.ID, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*ImageFileIDsLoader) LoadAllThunk

func (l *ImageFileIDsLoader) LoadAllThunk(keys []int) func() ([][]file.ID, []error)

LoadAllThunk returns a function that, when called, will block waiting for the slices of file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*ImageFileIDsLoader) LoadThunk

func (l *ImageFileIDsLoader) LoadThunk(key int) func() ([]file.ID, error)

LoadThunk returns a function that, when called, will block waiting for the file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*ImageFileIDsLoader) Prime

func (l *ImageFileIDsLoader) Prime(key int, value []file.ID) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type ImageFileIDsLoaderConfig

type ImageFileIDsLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([][]file.ID, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

ImageFileIDsLoaderConfig captures the config to create a new ImageFileIDsLoader

type ImageLoader

type ImageLoader struct {
	// contains filtered or unexported fields
}

ImageLoader batches and caches requests

func NewImageLoader

func NewImageLoader(config ImageLoaderConfig) *ImageLoader

NewImageLoader creates a new ImageLoader given a fetch, wait, and maxBatch

func (*ImageLoader) Clear

func (l *ImageLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*ImageLoader) Load

func (l *ImageLoader) Load(key int) (*models.Image, error)

Load an Image by key; batching and caching will be applied automatically

func (*ImageLoader) LoadAll

func (l *ImageLoader) LoadAll(keys []int) ([]*models.Image, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*ImageLoader) LoadAllThunk

func (l *ImageLoader) LoadAllThunk(keys []int) func() ([]*models.Image, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Images. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*ImageLoader) LoadThunk

func (l *ImageLoader) LoadThunk(key int) func() (*models.Image, error)

LoadThunk returns a function that, when called, will block waiting for an Image. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*ImageLoader) Prime

func (l *ImageLoader) Prime(key int, value *models.Image) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type ImageLoaderConfig

type ImageLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Image, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

ImageLoaderConfig captures the config to create a new ImageLoader

type Loaders

type Loaders struct {
	SceneByID    *SceneLoader
	SceneFiles   *SceneFileIDsLoader
	ImageFiles   *ImageFileIDsLoader
	GalleryFiles *GalleryFileIDsLoader

	GalleryByID   *GalleryLoader
	ImageByID     *ImageLoader
	PerformerByID *PerformerLoader
	StudioByID    *StudioLoader
	TagByID       *TagLoader
	MovieByID     *MovieLoader
	FileByID      *FileLoader
}

func From

func From(ctx context.Context) Loaders
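
From presumably returns the Loaders associated with ctx (attached by Middleware, below). A sketch of using it inside a request-scoped function, assuming the models package from the same module:

func sceneByID(ctx context.Context, id int) (*models.Scene, error) {
	// Repeated lookups for the same id are served from the loader's cache
	// after the first fetch.
	return loaders.From(ctx).SceneByID.Load(id)
}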

type Middleware

type Middleware struct {
	DatabaseProvider txn.DatabaseProvider
	Repository       manager.Repository
}

func (Middleware) Middleware

func (m Middleware) Middleware(next http.Handler) http.Handler
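
A sketch of wiring the middleware into an HTTP server; db, repo, and apiHandler are assumed to already exist with the appropriate types, and the behaviour of attaching the loaders to each request's context is inferred from the types above.

m := loaders.Middleware{
	DatabaseProvider: db,   // a txn.DatabaseProvider
	Repository:       repo, // a manager.Repository
}

mux := http.NewServeMux()
mux.Handle("/graphql", apiHandler)

// Handlers wrapped by m.Middleware can retrieve the loaders with From(r.Context()).
http.ListenAndServe(":9999", m.Middleware(mux))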

type MovieLoader

type MovieLoader struct {
	// contains filtered or unexported fields
}

MovieLoader batches and caches requests

func NewMovieLoader

func NewMovieLoader(config MovieLoaderConfig) *MovieLoader

NewMovieLoader creates a new MovieLoader given a fetch, wait, and maxBatch

func (*MovieLoader) Clear

func (l *MovieLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*MovieLoader) Load

func (l *MovieLoader) Load(key int) (*models.Movie, error)

Load a Movie by key; batching and caching will be applied automatically

func (*MovieLoader) LoadAll

func (l *MovieLoader) LoadAll(keys []int) ([]*models.Movie, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*MovieLoader) LoadAllThunk

func (l *MovieLoader) LoadAllThunk(keys []int) func() ([]*models.Movie, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Movies. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*MovieLoader) LoadThunk

func (l *MovieLoader) LoadThunk(key int) func() (*models.Movie, error)

LoadThunk returns a function that, when called, will block waiting for a Movie. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*MovieLoader) Prime

func (l *MovieLoader) Prime(key int, value *models.Movie) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type MovieLoaderConfig

type MovieLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Movie, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

MovieLoaderConfig captures the config to create a new MovieLoader

type PerformerLoader

type PerformerLoader struct {
	// contains filtered or unexported fields
}

PerformerLoader batches and caches requests

func NewPerformerLoader

func NewPerformerLoader(config PerformerLoaderConfig) *PerformerLoader

NewPerformerLoader creates a new PerformerLoader given a fetch, wait, and maxBatch

func (*PerformerLoader) Clear

func (l *PerformerLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*PerformerLoader) Load

func (l *PerformerLoader) Load(key int) (*models.Performer, error)

Load a Performer by key; batching and caching will be applied automatically

func (*PerformerLoader) LoadAll

func (l *PerformerLoader) LoadAll(keys []int) ([]*models.Performer, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*PerformerLoader) LoadAllThunk

func (l *PerformerLoader) LoadAllThunk(keys []int) func() ([]*models.Performer, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Performers. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PerformerLoader) LoadThunk

func (l *PerformerLoader) LoadThunk(key int) func() (*models.Performer, error)

LoadThunk returns a function that, when called, will block waiting for a Performer. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PerformerLoader) Prime

func (l *PerformerLoader) Prime(key int, value *models.Performer) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type PerformerLoaderConfig

type PerformerLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Performer, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

PerformerLoaderConfig captures the config to create a new PerformerLoader

type SceneFileIDsLoader

type SceneFileIDsLoader struct {
	// contains filtered or unexported fields
}

SceneFileIDsLoader batches and caches requests

func NewSceneFileIDsLoader

func NewSceneFileIDsLoader(config SceneFileIDsLoaderConfig) *SceneFileIDsLoader

NewSceneFileIDsLoader creates a new SceneFileIDsLoader given a fetch, wait, and maxBatch

func (*SceneFileIDsLoader) Clear

func (l *SceneFileIDsLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*SceneFileIDsLoader) Load

func (l *SceneFileIDsLoader) Load(key int) ([]file.ID, error)

Load a slice of file IDs by key; batching and caching will be applied automatically

func (*SceneFileIDsLoader) LoadAll

func (l *SceneFileIDsLoader) LoadAll(keys []int) ([][]file.ID, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*SceneFileIDsLoader) LoadAllThunk

func (l *SceneFileIDsLoader) LoadAllThunk(keys []int) func() ([][]file.ID, []error)

LoadAllThunk returns a function that, when called, will block waiting for the slices of file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*SceneFileIDsLoader) LoadThunk

func (l *SceneFileIDsLoader) LoadThunk(key int) func() ([]file.ID, error)

LoadThunk returns a function that, when called, will block waiting for the file IDs. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*SceneFileIDsLoader) Prime

func (l *SceneFileIDsLoader) Prime(key int, value []file.ID) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type SceneFileIDsLoaderConfig

type SceneFileIDsLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([][]file.ID, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

SceneFileIDsLoaderConfig captures the config to create a new SceneFileIDsLoader

type SceneLoader

type SceneLoader struct {
	// contains filtered or unexported fields
}

SceneLoader batches and caches requests

func NewSceneLoader

func NewSceneLoader(config SceneLoaderConfig) *SceneLoader

NewSceneLoader creates a new SceneLoader given a fetch, wait, and maxBatch

func (*SceneLoader) Clear

func (l *SceneLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*SceneLoader) Load

func (l *SceneLoader) Load(key int) (*models.Scene, error)

Load a Scene by key; batching and caching will be applied automatically

func (*SceneLoader) LoadAll

func (l *SceneLoader) LoadAll(keys []int) ([]*models.Scene, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*SceneLoader) LoadAllThunk

func (l *SceneLoader) LoadAllThunk(keys []int) func() ([]*models.Scene, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Scenes. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*SceneLoader) LoadThunk

func (l *SceneLoader) LoadThunk(key int) func() (*models.Scene, error)

LoadThunk returns a function that, when called, will block waiting for a Scene. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*SceneLoader) Prime

func (l *SceneLoader) Prime(key int, value *models.Scene) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type SceneLoaderConfig

type SceneLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Scene, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

SceneLoaderConfig captures the config to create a new SceneLoader

type StudioLoader

type StudioLoader struct {
	// contains filtered or unexported fields
}

StudioLoader batches and caches requests

func NewStudioLoader

func NewStudioLoader(config StudioLoaderConfig) *StudioLoader

NewStudioLoader creates a new StudioLoader given a fetch, wait, and maxBatch

func (*StudioLoader) Clear

func (l *StudioLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*StudioLoader) Load

func (l *StudioLoader) Load(key int) (*models.Studio, error)

Load a Studio by key; batching and caching will be applied automatically

func (*StudioLoader) LoadAll

func (l *StudioLoader) LoadAll(keys []int) ([]*models.Studio, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*StudioLoader) LoadAllThunk

func (l *StudioLoader) LoadAllThunk(keys []int) func() ([]*models.Studio, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Studios. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*StudioLoader) LoadThunk

func (l *StudioLoader) LoadThunk(key int) func() (*models.Studio, error)

LoadThunk returns a function that, when called, will block waiting for a Studio. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*StudioLoader) Prime

func (l *StudioLoader) Prime(key int, value *models.Studio) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type StudioLoaderConfig

type StudioLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Studio, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

StudioLoaderConfig captures the config to create a new StudioLoader

type TagLoader

type TagLoader struct {
	// contains filtered or unexported fields
}

TagLoader batches and caches requests

func NewTagLoader

func NewTagLoader(config TagLoaderConfig) *TagLoader

NewTagLoader creates a new TagLoader given a fetch, wait, and maxBatch

func (*TagLoader) Clear

func (l *TagLoader) Clear(key int)

Clear the value at key from the cache, if it exists

func (*TagLoader) Load

func (l *TagLoader) Load(key int) (*models.Tag, error)

Load a Tag by key; batching and caching will be applied automatically

func (*TagLoader) LoadAll

func (l *TagLoader) LoadAll(keys []int) ([]*models.Tag, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*TagLoader) LoadAllThunk

func (l *TagLoader) LoadAllThunk(keys []int) func() ([]*models.Tag, []error)

LoadAllThunk returns a function that, when called, will block waiting for the Tags. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*TagLoader) LoadThunk

func (l *TagLoader) LoadThunk(key int) func() (*models.Tag, error)

LoadThunk returns a function that, when called, will block waiting for a Tag. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*TagLoader) Prime

func (l *TagLoader) Prime(key int, value *models.Tag) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)

type TagLoaderConfig

type TagLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int) ([]*models.Tag, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch; 0 = no limit
	MaxBatch int
}

TagLoaderConfig captures the config to create a new TagLoader
