twpurge

package
v1.0.3
Published: Jan 7, 2023 License: MIT Imports: 10 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var MatchDefault = func(fn string) bool {
	ext := strings.ToLower(filepath.Ext(fn))
	switch ext {
	case ".html", ".vugu", ".jsx", ".vue":
		return true
	}
	return false
}

MatchDefault is a filename matcher function which returns true for files ending in .html, .vugu, .jsx, or .vue.

Functions

func PurgeKeysFromDist

func PurgeKeysFromDist(dist Dist) (map[string]struct{}, error)

FIXME: this should probably be called RuleNamesFromDist, and document the idea of "rule names" vs "purge keys". PurgeKeysFromDist runs PurgeKeysFromReader on the appropriate file(s) from the dist. A check is done to see if Dist implements interface { PurgeKeyMap() map[string]struct{} } and this is used if available. Otherwise the appropriate file(s) are processed from the dist using PurgeKeysFromReader.
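The optional-interface check described above is a common Go pattern. The sketch below illustrates its shape; the type names other than Dist's method set are hypothetical, and the fallback parsing path is elided.

```go
package main

import (
	"fmt"
	"io"
)

// dist mirrors the Dist interface from this package.
type dist interface {
	OpenDist(name string) (io.ReadCloser, error)
}

// purgeKeysFromDist sketches the optional-interface check: if the Dist also
// provides PurgeKeyMap, its precomputed map is used directly; otherwise the
// caller would fall back to parsing the dist's CSS (elided here).
func purgeKeysFromDist(d dist) (map[string]struct{}, bool) {
	if pk, ok := d.(interface{ PurgeKeyMap() map[string]struct{} }); ok {
		return pk.PurgeKeyMap(), true
	}
	return nil, false
}

// fastDist is a hypothetical Dist that carries a precomputed key map.
type fastDist struct{}

func (fastDist) OpenDist(name string) (io.ReadCloser, error) { return nil, io.EOF }
func (fastDist) PurgeKeyMap() map[string]struct{} {
	return map[string]struct{}{"flex": {}}
}

func main() {
	keys, fast := purgeKeysFromDist(fastDist{})
	fmt.Println(fast, len(keys)) // fast path taken, one key
}
```

The anonymous interface in the type assertion means the fast path works for any Dist that happens to have a PurgeKeyMap method, without this package declaring a named interface for it.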

func PurgeKeysFromReader

func PurgeKeysFromReader(cssR io.Reader) (map[string]struct{}, error)

PurgeKeysFromReader parses the contents of the given reader as CSS and builds a map of purge keys.

Types

type Checker

type Checker interface {
	ShouldPurgeKey(k string) bool
}

Checker is implemented by something that can answer the question "should this CSS rule be purged from the output because it is unused".

type DefaultTokenizer

type DefaultTokenizer struct {
	// contains filtered or unexported fields
}

DefaultTokenizer implements Tokenizer with a sensible default tokenization.

func NewDefaultTokenizer

func NewDefaultTokenizer(r io.Reader) *DefaultTokenizer

func (*DefaultTokenizer) NextToken

func (t *DefaultTokenizer) NextToken() ([]byte, error)

type Dist

type Dist interface {
	OpenDist(name string) (io.ReadCloser, error)
}

Dist matches tailwind.Dist

type Map

type Map map[string]struct{}

Map is a set of strings that implements Checker. The output of a Scanner is a Map that can be used during conversion to rapidly check if a style rule needs to be output.
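A minimal sketch of how such a set can satisfy the Checker interface, assuming (this is an inference, not stated above) that a key present in the Map was found during scanning and should therefore be kept, so ShouldPurgeKey reports true only for absent keys:

```go
package main

import "fmt"

// mapSet mirrors the shape of Map: a set of strings. The purge semantics
// sketched here are an assumption: scanned keys are kept, everything else
// is purged.
type mapSet map[string]struct{}

// merge adds every key from another set, mirroring Map.Merge.
func (m mapSet) merge(from mapSet) {
	for k := range from {
		m[k] = struct{}{}
	}
}

// shouldPurgeKey reports true when the key was never seen while scanning.
func (m mapSet) shouldPurgeKey(k string) bool {
	_, found := m[k]
	return !found
}

func main() {
	seen := mapSet{"flex": {}}
	seen.merge(mapSet{"mt-4": {}})
	fmt.Println(seen.shouldPurgeKey("flex"), seen.shouldPurgeKey("unused")) // false true
}
```

Using map[string]struct{} rather than map[string]bool keeps the set allocation-free per entry value and makes membership checks a single map lookup.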

func (Map) Merge

func (m Map) Merge(fromMap Map)

func (Map) ShouldPurgeKey

func (m Map) ShouldPurgeKey(k string) bool

ShouldPurgeKey implements Checker.

type Scanner

type Scanner struct {
	// contains filtered or unexported fields
}

Scanner scans through textual files (generally HTML-like content) and looks for tokens to be preserved when purging. The scanning is intentionally naive in order to keep its rules simple to understand and reasonably performant. (TODO: explain more)

func NewScanner

func NewScanner(ruleNames map[string]struct{}) *Scanner

func NewScannerFromDist

func NewScannerFromDist(dist Dist) (*Scanner, error)

func (*Scanner) Map

func (s *Scanner) Map() Map

Map returns the Map which is the result of all previous Scan calls.

func (*Scanner) Scan

func (s *Scanner) Scan(r io.Reader) error

func (*Scanner) ScanFile

func (s *Scanner) ScanFile(fpath string) error

func (*Scanner) WalkFunc

func (s *Scanner) WalkFunc(fnmatch func(fn string) bool) filepath.WalkFunc

WalkFunc returns a function which can be called by filepath.Walk to scan each matching file encountered. The fnmatch func says which files to scan, if nil is passed then MatchDefault will be used.

type Tokenizer

type Tokenizer interface {
	NextToken() ([]byte, error) // returns a token or error (not both), io.EOF indicates end of stream
}

Tokenizer is implemented by types that return successive tokens from a markup file.
