Documentation ¶
Index ¶
- Constants
- func FeedCrawler(crawlRequests chan *feedwatcher.FeedCrawlRequest)
- func GetFeed(url string, client *http.Client) (*http.Response, error)
- func GetFeedAndMakeResponse(url string, client *http.Client) *feedwatcher.FeedCrawlResponse
- func StartCrawlerPool(num int, crawlChannel chan *feedwatcher.FeedCrawlRequest)
Constants ¶
const Accept = "application/rss+xml, application/rdf+xml;q=0.8, application/atom+xml;q=0.6, application/xml;q=0.4, text/xml;q=0.4"
const UserAgent = "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36"
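These constants are presumably attached as headers to each crawl request. A minimal sketch of doing so by hand with the standard library (the request wiring here is illustrative, not this package's internals; the URL is a placeholder):

req, err := http.NewRequest("GET", "https://example.com/feed.xml", nil)
if err != nil {
	log.Fatal(err)
}
req.Header.Set("Accept", Accept)        // advertise supported feed formats, in preference order
req.Header.Set("User-Agent", UserAgent) // present a browser-like identity
resp, err := http.DefaultClient.Do(req)
if err != nil {
	log.Fatal(err)
}
defer resp.Body.Close()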
Variables ¶
This section is empty.
Functions ¶
func FeedCrawler ¶
func FeedCrawler(crawlRequests chan *feedwatcher.FeedCrawlRequest)
FeedCrawler pulls FeedCrawlRequests from the crawlRequests channel, fetches the given URL, and returns a response.
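Since FeedCrawler blocks reading from the channel, it is presumably started as a goroutine; a minimal sketch:

crawlRequests := make(chan *feedwatcher.FeedCrawlRequest)
go FeedCrawler(crawlRequests)
// Requests sent on crawlRequests are now handled by this goroutine.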
func GetFeed ¶
func GetFeed(url string, client *http.Client) (*http.Response, error)
GetFeed gets a URL and returns an http.Response. It sets a reasonable timeout on the connection and on reads from the server. Callers must Close() the response.Body or risk leaking connections.
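A minimal sketch of a direct fetch (the URL is a placeholder):

resp, err := GetFeed("https://example.com/feed.xml", &http.Client{})
if err != nil {
	log.Fatalf("fetch failed: %v", err)
}
defer resp.Body.Close() // required, or the connection leaks
body, err := io.ReadAll(resp.Body)
if err != nil {
	log.Fatal(err)
}
fmt.Printf("got %d bytes of feed data\n", len(body))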
func GetFeedAndMakeResponse ¶
func GetFeedAndMakeResponse(url string, client *http.Client) *feedwatcher.FeedCrawlResponse
GetFeedAndMakeResponse gets a URL and returns a FeedCrawlResponse. It sets FeedCrawlResponse.Error if there was a problem retrieving the URL.
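A sketch of the wrapper form; failures surface on the returned struct rather than as a second return value (assuming FeedCrawlResponse.Error is an ordinary error):

resp := GetFeedAndMakeResponse("https://example.com/feed.xml", &http.Client{})
if resp.Error != nil {
	log.Printf("crawl failed: %v", resp.Error)
	return
}
// resp now carries the fetch result for downstream processing.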
func StartCrawlerPool ¶
func StartCrawlerPool(num int, crawlChannel chan *feedwatcher.FeedCrawlRequest)
StartCrawlerPool creates a pool of num HTTP crawlers listening on the crawlChannel.
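A sketch of driving the pool end to end. The FeedCrawlRequest field names (URI, ResponseChan) are assumptions about the feedwatcher types, not documented on this page:

crawlChannel := make(chan *feedwatcher.FeedCrawlRequest)
StartCrawlerPool(4, crawlChannel) // four concurrent crawlers

req := &feedwatcher.FeedCrawlRequest{
	URI:          "https://example.com/feed.xml",            // assumed field
	ResponseChan: make(chan *feedwatcher.FeedCrawlResponse), // assumed field
}
crawlChannel <- req
resp := <-req.ResponseChan // wait for whichever crawler picked it up
if resp.Error != nil {
	log.Printf("crawl failed: %v", resp.Error)
}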
Types ¶
This section is empty.