Documentation ¶
Overview ¶
Package html minifies HTML5 following the specifications at http://www.w3.org/TR/html5/syntax.html.
Constants ¶
This section is empty.
Variables ¶
var DefaultMinifier = &Minifier{}
DefaultMinifier is the default minifier.
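As a minimal sketch (the import paths github.com/tdewolff/minify and github.com/tdewolff/minify/html are assumed here), DefaultMinifier can be registered on a minify.M directly in place of a &Minifier{} literal with default options:

package main

import (
	"os"
	"strings"

	"github.com/tdewolff/minify"
	"github.com/tdewolff/minify/html"
)

func main() {
	m := minify.New()
	m.Add("text/html", html.DefaultMinifier) // same as registering &html.Minifier{}

	in := strings.NewReader("<html><body><p>  Example  </p></body></html>")
	if err := m.Minify("text/html", os.Stdout, in); err != nil {
		panic(err)
	}
}

Since DefaultMinifier is simply &Minifier{}, this behaves identically to registering a fresh Minifier with no options set.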
Functions ¶
func Minify ¶
Minify minifies HTML data; it reads from r and writes to w.
Example ¶
m := minify.New()
m.AddFunc("text/html", Minify)
m.AddFunc("text/css", css.Minify)
m.AddFunc("image/svg+xml", svg.Minify)
m.AddFuncRegexp(regexp.MustCompile("^(application|text)/(x-)?(java|ecma)script$"), js.Minify)
m.AddFuncRegexp(regexp.MustCompile("[/+]json$"), json.Minify)
m.AddFuncRegexp(regexp.MustCompile("[/+]xml$"), xml.Minify)

// set URL to minify link locations too
m.URL, _ = url.Parse("https://www.example.com/")
if err := m.Minify("text/html", os.Stdout, os.Stdin); err != nil {
	panic(err)
}
Output:
Example (Options) ¶
m := minify.New()
m.Add("text/html", &Minifier{
	KeepDefaultAttrVals: true,
	KeepWhitespace:      true,
})

if err := m.Minify("text/html", os.Stdout, os.Stdin); err != nil {
	panic(err)
}
Output:
Example (Reader) ¶
b := bytes.NewReader([]byte("<html><body><h1>Example</h1></body></html>"))

m := minify.New()
m.Add("text/html", &Minifier{})

r := m.Reader("text/html", b)
if _, err := io.Copy(os.Stdout, r); err != nil {
	panic(err)
}
Output:
<h1>Example</h1>
Example (Writer) ¶
m := minify.New()
m.Add("text/html", &Minifier{})

w := m.Writer("text/html", os.Stdout)
w.Write([]byte("<html><body><h1>Example</h1></body></html>"))
w.Close()
Output:
<h1>Example</h1>
Types ¶
type Minifier ¶
type Minifier struct {
	KeepConditionalComments bool
	KeepDefaultAttrVals     bool
	KeepDocumentTags        bool
	KeepEndTags             bool
	KeepWhitespace          bool
}
Minifier is an HTML minifier.
type Token ¶ added in v1.1.0
type Token struct {
	html.TokenType
	Hash    html.Hash
	Data    []byte
	Text    []byte
	AttrVal []byte
	Traits  traits
}
Token is a single token unit with an attribute value (if given) and hash of the data.
type TokenBuffer ¶ added in v1.1.0
type TokenBuffer struct {
// contains filtered or unexported fields
}
TokenBuffer is a buffer that allows for token look-ahead.
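A hedged sketch of driving the buffer, assuming the v1 lexer package github.com/tdewolff/parse/html with its NewLexer(io.Reader) constructor and ErrorToken/TextToken token types: tokens are consumed one at a time with Shift, while Peek inspects upcoming tokens without consuming them.

package main

import (
	"fmt"
	"strings"

	minhtml "github.com/tdewolff/minify/html"
	"github.com/tdewolff/parse/html"
)

func main() {
	l := html.NewLexer(strings.NewReader("<p>Example</p>"))
	z := minhtml.NewTokenBuffer(l)
	for {
		t := z.Shift()
		if t.TokenType == html.ErrorToken { // io.EOF ends the token stream
			break
		}
		// Peek(0) looks at the next unshifted token without consuming it.
		if next := z.Peek(0); next.TokenType == html.TextToken {
			fmt.Println("upcoming text:", string(next.Data))
		}
		fmt.Println(t.TokenType, string(t.Data))
	}
}

Because Peek only extends the internal buffer, the minifier can inspect what follows a tag before deciding how to rewrite it.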
func NewTokenBuffer ¶ added in v1.1.0
func NewTokenBuffer(l *html.Lexer) *TokenBuffer
NewTokenBuffer returns a new TokenBuffer.
func (*TokenBuffer) Attributes ¶
func (z *TokenBuffer) Attributes(hashes ...html.Hash) []*Token
Attributes extracts the given attribute hashes from a tag. It returns pointers to the requested attribute tokens in the same order, or nil for attributes that are not present.
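As a further sketch under the same assumptions (the hash constants html.A, html.Href, and html.Rel are taken from the parse/html package and are not confirmed by this page), Attributes could be used right after shifting a start tag:

package main

import (
	"fmt"
	"strings"

	minhtml "github.com/tdewolff/minify/html"
	"github.com/tdewolff/parse/html"
)

func main() {
	in := `<a href="https://www.example.com/" rel="nofollow">link</a>`
	z := minhtml.NewTokenBuffer(html.NewLexer(strings.NewReader(in)))
	for {
		t := z.Shift()
		if t.TokenType == html.ErrorToken {
			break
		}
		if t.TokenType == html.StartTagToken && t.Hash == html.A {
			// Ask for href and rel; the result preserves that order and
			// holds nil for any attribute missing from the tag.
			attrs := z.Attributes(html.Href, html.Rel)
			if attrs[0] != nil {
				fmt.Println("href:", string(attrs[0].AttrVal))
			}
			if attrs[1] != nil {
				fmt.Println("rel:", string(attrs[1].AttrVal))
			}
		}
	}
}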
func (*TokenBuffer) Peek ¶ added in v1.1.0
func (z *TokenBuffer) Peek(pos int) *Token
Peek returns the element at offset pos from the current position and may allocate to grow the buffer. Peeking past an error will panic.
func (*TokenBuffer) Shift ¶ added in v1.1.0
func (z *TokenBuffer) Shift() *Token
Shift returns the first element and advances the position.