Server

package
v0.0.0-...-8c873f1
Warning

This package is not in the latest version of its module.

Published: Nov 25, 2024 License: MIT Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CrawlControl

type CrawlControl struct {
	State CrawlState
	// contains filtered or unexported fields
}

CrawlControl is the struct that minds a particular crawler.

type CrawlServer

type CrawlServer struct {
	// contains filtered or unexported fields
}

CrawlServer defines the struct that holds the status of crawls.

func New

func New(f Fetcher) *CrawlServer

New creates and returns an empty CrawlServer that will crawl using the given Fetcher.

func (*CrawlServer) CrawlResult

func (c *CrawlServer) CrawlResult(req *crawl.URLRequest, stream crawl.Crawl_CrawlResultServer) error

CrawlResult sends the status of a given URL back over gRPC.

func (*CrawlServer) CrawlSite

func (c *CrawlServer) CrawlSite(ctx context.Context, req *crawl.URLRequest) (*crawl.URLState, error)

CrawlSite starts, stops, or checks the status of a site.

func (*CrawlServer) Pause

func (c *CrawlServer) Pause(url string) (string, CrawlState, error)

Pause pauses a crawl for a URL.

func (*CrawlServer) Probe

func (c *CrawlServer) Probe(url string) string

Probe checks the current state of a crawl without changing anything.

func (*CrawlServer) Show

func (c *CrawlServer) Show(url string) string

Show translates the crawl tree into a string and returns it. XXX: Note that this forces the output into a fixed format, but since this is for the CLI, we can live with it for now. Otherwise we need to extend the gotree interface and add a custom formatter for JSON or whatever. (YAGNI)
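The package does not show Show's implementation, but the comment describes a fixed-format rendering of the crawl tree. A self-contained sketch of that kind of depth-first walk; the node type, field names, and indentation are assumptions for illustration, not the package's actual gotree-based code:

```go
package main

import (
	"fmt"
	"strings"
)

// node is an assumed stand-in for the package's crawl tree.
type node struct {
	url      string
	children []*node
}

// render walks the tree depth-first, indenting two spaces per level.
func render(n *node, depth int, b *strings.Builder) {
	b.WriteString(strings.Repeat("  ", depth))
	b.WriteString(n.url)
	b.WriteString("\n")
	for _, c := range n.children {
		render(c, depth+1, b)
	}
}

// show returns the whole tree as one fixed-format string.
func show(root *node) string {
	var b strings.Builder
	render(root, 0, &b)
	return b.String()
}

func main() {
	root := &node{url: "https://example.com/", children: []*node{
		{url: "https://example.com/about"},
	}}
	fmt.Print(show(root))
}
```

The fixed format falls out of the hard-coded indentation in render; supporting JSON or other formats would mean threading a formatter through the walk, which is exactly the extension the doc comment declines as YAGNI.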

func (*CrawlServer) Start

func (c *CrawlServer) Start(url string) (string, CrawlState, error)

Start starts a crawl for a URL.
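Start, Pause, and Probe together imply a small per-URL state machine inside CrawlServer, guarded against concurrent gRPC calls. A self-contained sketch of that bookkeeping; the controller type, state names, and transition rules are all assumptions, not the package's actual implementation:

```go
package main

import (
	"fmt"
	"sync"
)

// CrawlState values here are assumed for illustration.
type CrawlState int

const (
	Stopped CrawlState = iota
	Running
	Paused
)

// controller mimics the per-URL state CrawlServer might keep.
type controller struct {
	mu     sync.Mutex
	states map[string]CrawlState
}

func newController() *controller {
	return &controller{states: make(map[string]CrawlState)}
}

// Start moves a URL into the Running state.
func (c *controller) Start(url string) CrawlState {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.states[url] = Running
	return c.states[url]
}

// Pause moves a Running crawl to Paused; other states are left alone.
func (c *controller) Pause(url string) CrawlState {
	c.mu.Lock()
	defer c.mu.Unlock()
	if c.states[url] == Running {
		c.states[url] = Paused
	}
	return c.states[url]
}

// Probe reports the current state without changing anything,
// matching Probe's documented behavior above.
func (c *controller) Probe(url string) CrawlState {
	c.mu.Lock()
	defer c.mu.Unlock()
	return c.states[url]
}

func main() {
	c := newController()
	c.Start("https://example.com/")
	c.Pause("https://example.com/")
	fmt.Println(c.Probe("https://example.com/") == Paused) // prints: true
}
```

The mutex matters because CrawlSite and CrawlResult arrive on separate gRPC goroutines; any real implementation needs some such guard around the shared state map.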

type CrawlState

type CrawlState int

CrawlState defines the crawler states we'll maintain.

type Fetcher

type Fetcher interface {
	// Fetch returns the body of URL and
	// a slice of URLs found on that page.
	Fetch(url string) (body string, urls []string, err error)
}

Fetcher defines an interface that can fetch URLs.
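Any type with a matching Fetch method satisfies Fetcher, which makes it easy to exercise the server against canned data instead of the network. A minimal sketch, where fakeFetcher and its page data are illustrative and not part of the package:

```go
package main

import (
	"errors"
	"fmt"
)

// Fetcher is reproduced from the package documentation.
type Fetcher interface {
	// Fetch returns the body of URL and
	// a slice of URLs found on that page.
	Fetch(url string) (body string, urls []string, err error)
}

// fakePage holds canned results for one URL.
type fakePage struct {
	body string
	urls []string
}

// fakeFetcher serves canned pages from a map, with no network I/O.
type fakeFetcher map[string]*fakePage

// Fetch implements Fetcher by looking the URL up in the map.
func (f fakeFetcher) Fetch(url string) (string, []string, error) {
	if page, ok := f[url]; ok {
		return page.body, page.urls, nil
	}
	return "", nil, errors.New("not found: " + url)
}

// demo is made-up sample data for the sketch.
var demo = fakeFetcher{
	"https://example.com/": {
		body: "Example home page",
		urls: []string{"https://example.com/about"},
	},
}

func main() {
	var f Fetcher = demo // fakeFetcher satisfies Fetcher
	body, urls, err := f.Fetch("https://example.com/")
	fmt.Println(body, len(urls), err) // prints: Example home page 1 <nil>
}
```

A value like demo is what you would hand to New in tests, while production code would pass a Fetcher backed by real HTTP requests.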
