Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
Types ¶
type JobInfo ¶
type JobInfo struct {
	// Config is the origin scrape config from the config file
	Config *config.ScrapeConfig
	// Cli is the http.Client used for scraping.
	// All scraping requests will be proxied to env SCRAPE_PROXY if it is not empty.
	Cli *http.Client
	// contains filtered or unexported fields
}
JobInfo contains the HTTP client used to scrape a target, along with the origin scrape config.
type Manager ¶
type Manager struct {
// contains filtered or unexported fields
}
Manager includes all jobs
func (*Manager) ApplyConfig ¶
func (s *Manager) ApplyConfig(cfg *prom.ConfigInfo) error
ApplyConfig updates the Manager from the given config.
type Scraper ¶ added in v0.2.0
type Scraper struct {
	// HTTPResponse saves the HTTP response when RequestTo is called
	HTTPResponse *http.Response
	// contains filtered or unexported fields
}
Scraper performs a single scrape. RequestTo must be called before ParseResponse.
func NewScraper ¶ added in v0.2.0
func NewScraper(job *JobInfo, url string, log logrus.FieldLogger) *Scraper
NewScraper creates a new Scraper.
func (*Scraper) ParseResponse ¶ added in v0.2.0
ParseResponse parses the scraped metrics. RequestTo must be called before ParseResponse.
func (*Scraper) RequestTo ¶ added in v0.2.0
RequestTo performs the HTTP request to the target; the response is saved to s.HTTPResponse. ParseResponse must be called if RequestTo returns a nil error.
func (*Scraper) WithRawWriter ¶ added in v0.2.0
WithRawWriter adds writers. Data will be copied to the writers while ParseResponse is processing; gzipped data will be decoded before being written to the writers.