crawler

package
v0.0.0-...-9831a62

This package is not in the latest version of its module.
Published: Oct 19, 2021 License: GPL-3.0 Imports: 5 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ConcurrentCrawler

type ConcurrentCrawler struct {
	// contains filtered or unexported fields
}

ConcurrentCrawler limits the number of concurrent requests.

func WithConcurrencyLimit

func WithConcurrencyLimit(crawler crawler, limit int) *ConcurrentCrawler

WithConcurrencyLimit adds a concurrency limit to the crawler.

func (*ConcurrentCrawler) Crawl

func (c *ConcurrentCrawler) Crawl(ctx context.Context, u *url.URL) ([]byte, error)

Crawl downloads a page.

type Crawler

type Crawler struct {
	// contains filtered or unexported fields
}

Crawler crawls the web and downloads pages.

func New

func New() *Crawler

New returns a new crawler.

func (*Crawler) Crawl

func (c *Crawler) Crawl(ctx context.Context, url *url.URL) ([]byte, error)

Crawl downloads a page.
