Documentation ¶
Index ¶
Constants ¶
const (
	StatusFinish    = "finish"
	StatusRunning   = "running"
	StatusNoSpiders = "no_spiders"
)
Variables ¶
var (
ErrTrackIsNil = errors.New("tracking number is nil")
)
Functions ¶
This section is empty.
Types ¶
type Manager ¶
type Manager struct {
	SpiderFinder SpiderFinder
	// contains filtered or unexported fields
}
Manager creates Crawler instances to start spiders. A Crawler executes N tasks in goroutines and collects the results.
func NewCrawlerManager ¶
func (*Manager) Start ¶
Start runs spiders in parallel and waits for all of them to finish. For each spider, it creates new scraper arguments with the tracking number substituted into the URL, body, headers, etc.
After scraping, the result is taken from the scraper and saved to the results. If an error occurs, it is saved to the results as well.
One spider can find another tracking number in its result and start a new spider for that tracking number.
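The fan-out-and-wait behavior described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: the Result type, the scrape stand-in, and the StartAll function are all hypothetical names introduced here.

```go
package main

import (
	"fmt"
	"sync"
)

// Result is a hypothetical container for one spider's outcome;
// both successful bodies and errors end up in the results slice.
type Result struct {
	Track string
	Body  string
	Err   error
}

// scrape stands in for a real spider run: a real implementation would
// substitute the tracking number into the URL, body, or headers.
func scrape(track string) Result {
	return Result{Track: track, Body: "scraped:" + track}
}

// StartAll starts one goroutine per tracking number and waits for all
// of them to finish, collecting every result under a mutex.
func StartAll(tracks []string) []Result {
	var (
		wg      sync.WaitGroup
		mu      sync.Mutex
		results []Result
	)
	for _, t := range tracks {
		wg.Add(1)
		go func(track string) {
			defer wg.Done()
			r := scrape(track)
			mu.Lock()
			results = append(results, r) // errors are appended too
			mu.Unlock()
		}(t)
	}
	wg.Wait()
	return results
}

func main() {
	for _, r := range StartAll([]string{"AB123", "CD456"}) {
		fmt.Println(r.Track, r.Body)
	}
}
```

A real Manager would also need to handle the recursive case, feeding tracking numbers discovered in results back into the same fan-out loop.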
type SpiderFinder ¶
SpiderFinder finds spiders by matching a tracking number against each spider's regexp.