Documentation ¶
Index ¶
- Constants
- Variables
- type Cache
- func (c Cache) GetScraper(scraperID string) *models.Scraper
- func (c Cache) ListGalleryScrapers() []*models.Scraper
- func (c Cache) ListMovieScrapers() []*models.Scraper
- func (c Cache) ListPerformerScrapers() []*models.Scraper
- func (c Cache) ListSceneScrapers() []*models.Scraper
- func (c *Cache) ReloadScrapers() error
- func (c Cache) ScrapeGallery(scraperID string, galleryID int) (*models.ScrapedGallery, error)
- func (c Cache) ScrapeGalleryFragment(scraperID string, gallery models.ScrapedGalleryInput) (*models.ScrapedGallery, error)
- func (c Cache) ScrapeGalleryURL(url string) (*models.ScrapedGallery, error)
- func (c Cache) ScrapeMovieURL(url string) (*models.ScrapedMovie, error)
- func (c Cache) ScrapePerformer(scraperID string, scrapedPerformer models.ScrapedPerformerInput) (*models.ScrapedPerformer, error)
- func (c Cache) ScrapePerformerList(scraperID string, query string) ([]*models.ScrapedPerformer, error)
- func (c Cache) ScrapePerformerURL(url string) (*models.ScrapedPerformer, error)
- func (c Cache) ScrapeScene(scraperID string, sceneID int) (*models.ScrapedScene, error)
- func (c Cache) ScrapeSceneFragment(scraperID string, scene models.ScrapedSceneInput) (*models.ScrapedScene, error)
- func (c Cache) ScrapeSceneQuery(scraperID string, query string) ([]*models.ScrapedScene, error)
- func (c Cache) ScrapeSceneURL(url string) (*models.ScrapedScene, error)
- func (c *Cache) UpdateConfig(globalConfig GlobalConfig)
- type GlobalConfig
Constants ¶
const FreeonesScraperID = "builtin_freeones"
FreeonesScraperID is the scraper ID for the built-in Freeones scraper.
Variables ¶
var ErrMaxRedirects = errors.New("maximum number of HTTP redirects reached")
Functions ¶
This section is empty.
Types ¶
type Cache ¶ added in v0.3.0
type Cache struct {
// contains filtered or unexported fields
}
Cache stores scraper details.
func NewCache ¶ added in v0.3.0
func NewCache(globalConfig GlobalConfig, txnManager models.TransactionManager) (*Cache, error)
NewCache returns a new Cache, loading scraper configurations from the scraper path provided in the global config object. It returns an error if the scraper directory could not be loaded.
Scraper configurations are loaded from yml files in the provided scrapers directory and any subdirectories.
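A minimal sketch of constructing the cache at application startup. Only the NewCache signature above comes from this package; the import path is assumed to be the upstream stash repository, and the cfg and txn values are assumed to be supplied by the application's own configuration and database layers:

```go
package main

import (
	"log"

	"github.com/stashapp/stash/pkg/models"
	"github.com/stashapp/stash/pkg/scraper"
)

// buildCache wires up a scraper cache at startup. cfg and txn are
// placeholders for values the surrounding application constructs.
func buildCache(cfg scraper.GlobalConfig, txn models.TransactionManager) *scraper.Cache {
	c, err := scraper.NewCache(cfg, txn)
	if err != nil {
		// The scraper directory could not be loaded; abort startup.
		log.Fatalf("loading scraper configurations: %v", err)
	}
	return c
}
```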
func (Cache) GetScraper ¶ added in v0.11.0
GetScraper returns the scraper matching the provided ID.
func (Cache) ListGalleryScrapers ¶ added in v0.4.0
ListGalleryScrapers returns a list of scrapers that are capable of scraping galleries.
func (Cache) ListMovieScrapers ¶ added in v0.3.0
ListMovieScrapers returns a list of scrapers that are capable of scraping movies.
func (Cache) ListPerformerScrapers ¶ added in v0.3.0
ListPerformerScrapers returns a list of scrapers that are capable of scraping performers.
func (Cache) ListSceneScrapers ¶ added in v0.3.0
ListSceneScrapers returns a list of scrapers that are capable of scraping scenes.
func (*Cache) ReloadScrapers ¶ added in v0.3.0
ReloadScrapers clears the scraper cache and reloads from the scraper path. In the event of an error during loading, the cache will be left empty.
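Taken together with UpdateConfig below, reacting to a settings change is a two-step operation; a hedged sketch (the applyConfig helper name and the newConfig value are illustrative, not part of this package):

```go
// applyConfig pushes new global settings into the cache, then reloads
// scraper configurations from the (possibly changed) scraper path.
// Per the doc above, a load error leaves the cache empty.
func applyConfig(c *scraper.Cache, newConfig scraper.GlobalConfig) error {
	c.UpdateConfig(newConfig)
	return c.ReloadScrapers()
}
```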
func (Cache) ScrapeGallery ¶ added in v0.4.0
ScrapeGallery uses the scraper with the provided ID to scrape a gallery using existing data.
func (Cache) ScrapeGalleryFragment ¶ added in v0.10.0
func (c Cache) ScrapeGalleryFragment(scraperID string, gallery models.ScrapedGalleryInput) (*models.ScrapedGallery, error)
ScrapeGalleryFragment uses the scraper with the provided ID to scrape a gallery.
func (Cache) ScrapeGalleryURL ¶ added in v0.4.0
func (c Cache) ScrapeGalleryURL(url string) (*models.ScrapedGallery, error)
ScrapeGalleryURL uses the first scraper it finds that matches the provided URL to scrape a gallery. If no scraper matches the URL, nil is returned.
func (Cache) ScrapeMovieURL ¶ added in v0.3.0
func (c Cache) ScrapeMovieURL(url string) (*models.ScrapedMovie, error)
ScrapeMovieURL uses the first scraper it finds that matches the provided URL to scrape a movie. If no scraper matches the URL, nil is returned.
func (Cache) ScrapePerformer ¶ added in v0.3.0
func (c Cache) ScrapePerformer(scraperID string, scrapedPerformer models.ScrapedPerformerInput) (*models.ScrapedPerformer, error)
ScrapePerformer uses the scraper with the provided ID to scrape a performer using the provided performer fragment.
func (Cache) ScrapePerformerList ¶ added in v0.3.0
func (c Cache) ScrapePerformerList(scraperID string, query string) ([]*models.ScrapedPerformer, error)
ScrapePerformerList uses the scraper with the provided ID to query for performers using the provided query string. It returns a list of scraped performer data.
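A sketch of a name query using the built-in Freeones scraper ID from the Constants section; the queryPerformers wrapper and the query string are illustrative:

```go
// queryPerformers searches the built-in Freeones scraper for performers
// matching the given name and returns the scraped candidates.
func queryPerformers(c scraper.Cache, name string) ([]*models.ScrapedPerformer, error) {
	return c.ScrapePerformerList(scraper.FreeonesScraperID, name)
}
```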
func (Cache) ScrapePerformerURL ¶ added in v0.3.0
func (c Cache) ScrapePerformerURL(url string) (*models.ScrapedPerformer, error)
ScrapePerformerURL uses the first scraper it finds that matches the provided URL to scrape a performer. If no scraper matches the URL, nil is returned.
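Because the URL-based methods signal "no matching scraper" by returning nil rather than an error, callers need an explicit nil check. A sketch (the wrapper function and its error message are illustrative, not part of this package):

```go
import (
	"fmt"

	"github.com/stashapp/stash/pkg/models"
	"github.com/stashapp/stash/pkg/scraper"
)

// scrapePerformerFromURL distinguishes a scrape failure (err != nil)
// from the case where no configured scraper matched the URL (nil, nil).
func scrapePerformerFromURL(c scraper.Cache, url string) (*models.ScrapedPerformer, error) {
	p, err := c.ScrapePerformerURL(url)
	if err != nil {
		return nil, err
	}
	if p == nil {
		return nil, fmt.Errorf("no scraper matches %q", url)
	}
	return p, nil
}
```

The same nil-check pattern applies to ScrapeGalleryURL, ScrapeMovieURL, and ScrapeSceneURL.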
func (Cache) ScrapeScene ¶ added in v0.3.0
ScrapeScene uses the scraper with the provided ID to scrape a scene using existing data.
func (Cache) ScrapeSceneFragment ¶ added in v0.10.0
func (c Cache) ScrapeSceneFragment(scraperID string, scene models.ScrapedSceneInput) (*models.ScrapedScene, error)
ScrapeSceneFragment uses the scraper with the provided ID to scrape a scene.
func (Cache) ScrapeSceneQuery ¶ added in v0.10.0
ScrapeSceneQuery uses the scraper with the provided ID to query for scenes using the provided query string. It returns a list of scraped scene data.
func (Cache) ScrapeSceneURL ¶ added in v0.3.0
func (c Cache) ScrapeSceneURL(url string) (*models.ScrapedScene, error)
ScrapeSceneURL uses the first scraper it finds that matches the provided URL to scrape a scene. If no scraper matches the URL, nil is returned.
func (*Cache) UpdateConfig ¶ added in v0.3.0
func (c *Cache) UpdateConfig(globalConfig GlobalConfig)
UpdateConfig updates the global config for the cache. If the scraper path has changed, ReloadScrapers must be called separately.