Documentation
Overview
Package staticgen provides static website generation from a live server.
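A minimal usage sketch (not taken from the package docs): it assumes the module path github.com/tj/staticgen and that Run accepts a context.Context and returns an error; the Command value is a hypothetical example.

package main

import (
	"context"
	"log"

	"github.com/tj/staticgen"
)

func main() {
	g := staticgen.Generator{
		Config: staticgen.Config{
			URL:     "http://127.0.0.1:3000", // documented default target
			Dir:     "build",                 // documented default output directory
			Command: "npm start",             // hypothetical server command
		},
	}

	// Run starts the server command, crawls the site into Dir,
	// and shuts the server down when crawling completes.
	if err := g.Run(context.Background()); err != nil {
		log.Fatalf("generating static site: %v", err)
	}
}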
Index

type Config
type EventStartedServer
type EventStartingServer
type EventVisitedResource
type Generator
func (*Generator) Run
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Config
type Config struct {
	// URL is the target website to crawl. Defaults to "http://127.0.0.1:3000".
	URL string `json:"url"`

	// Dir is the static website output directory. Defaults to "build".
	Dir string `json:"dir"`

	// Command is the optional server command executed before crawling.
	Command string `json:"command"`

	// Pages is a list of paths added to crawl, typically
	// including unlinked pages such as error pages,
	// landing pages and so on.
	Pages []string `json:"pages"`

	// Concurrency is the number of concurrent pages to crawl. Defaults to 30.
	Concurrency int `json:"concurrency"`

	// Allow404 can be enabled to opt-in to pages resulting in a 404,
	// which otherwise lead to an error.
	Allow404 bool `json:"allow_404"`
}
Config is the static website generator configuration.
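The json struct tags suggest the configuration can also be expressed as JSON; a sketch, with illustrative values (the url, dir, and concurrency values shown happen to match the documented defaults):

{
  "url": "http://127.0.0.1:3000",
  "dir": "build",
  "command": "npm start",
  "pages": ["/404", "/error"],
  "concurrency": 30,
  "allow_404": true
}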
type EventStartedServer
EventStartedServer is emitted when the configured server command has started.
type EventStartingServer
EventStartingServer is emitted when the configured server command is starting.
type EventVisitedResource

type EventVisitedResource struct {
	Target
	Duration   time.Duration
	StatusCode int
	Filename   string
	Error      error
}
EventVisitedResource is emitted after a resource has been visited, reporting the crawl duration, HTTP status code, output filename, and any error encountered.
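How these events are delivered is not shown in this documentation; assuming they arrive as interface{} values (for example over a channel), a hypothetical logger could distinguish the three event types with a type switch:

package main

import (
	"log"

	"github.com/tj/staticgen"
)

// logEvent is a hypothetical helper; the event delivery mechanism
// is an assumption, not part of the documented API.
func logEvent(e interface{}) {
	switch e := e.(type) {
	case staticgen.EventStartingServer:
		log.Print("starting server")
	case staticgen.EventStartedServer:
		log.Print("server started")
	case staticgen.EventVisitedResource:
		if e.Error != nil {
			log.Printf("error visiting resource: %v", e.Error)
			return
		}
		log.Printf("visited %s in %s (status %d)", e.Filename, e.Duration, e.StatusCode)
	}
}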
type Generator

type Generator struct {
	// Config used for crawling and producing the static website.
	Config

	// HTTPClient ...
	HTTPClient *http.Client
	// contains filtered or unexported fields
}
Generator is a static website generator.
func (*Generator) Run
Run starts the configured server command, performs the crawl, and waits for completion before shutting down the configured server.
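Because Run manages the server's lifecycle, cancelling its context is the natural way to stop early. A sketch, again assuming Run accepts a context.Context and returns an error:

package main

import (
	"context"
	"log"
	"os/signal"
	"syscall"

	"github.com/tj/staticgen"
)

func main() {
	// Cancel the crawl, and shut down the server command, on Ctrl-C.
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGINT, syscall.SIGTERM)
	defer stop()

	g := staticgen.Generator{
		Config: staticgen.Config{
			Command: "npm start",      // hypothetical server command
			Pages:   []string{"/404"}, // crawl an unlinked error page too
		},
	}

	if err := g.Run(ctx); err != nil {
		log.Fatalf("generating static site: %v", err)
	}
}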
Directories
Path | Synopsis
---|---
_examples |
cmd |
internal |
    crawler | Package crawler provides a website crawler.
    deduplicator | Package deduplicator provides URL deduplication.