Documentation
Overview
Package tokenizer tokenizes a CSS stream. It follows the spec defined at http://www.w3.org/TR/css-syntax/#tokenization.
Index
Constants
const (
    Ident tokenType = iota // 0
    AtKeyword              // 1
    String                 // 2
    BadString              // 3
    BadUri                 // 4
    BadComment             // 5
    Hash                   // 6
    Number                 // 7
    Percentage             // 8
    Dimension              // 9
    Uri                    // 10
    UnicodeRange           // 11
    CDO                    // 12
    CDC                    // 13
    Colon                  // 14
    Semicolon              // 15
    Comma                  // 16
    LBrace                 // 17
    RBrace                 // 18
    LParen                 // 19
    RParen                 // 20
    LBracket               // 21
    RBracket               // 22
    Includes               // 23
    Prefixmatch            // 24
    Suffixmatch            // 25
    Dashmatch              // 26
    Comment                // 27
    Function               // 28
    Delim                  // 29
    SubstringMatch         // 30
    Column                 // 31
    WS                     // 32
)
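Because these are bare iota values of the unexported tokenType, they print as plain integers. A minimal sketch of mapping them back to readable names for debugging, written as if it were a file inside the package (hypothetical helper, not part of the documented API):

package tokenizer

import "fmt"

// tokenNames maps each token-type constant to a printable label.
// Hypothetical debugging helper; not part of the package's documented API.
var tokenNames = map[tokenType]string{
    Ident:     "Ident",
    AtKeyword: "AtKeyword",
    String:    "String",
    // ... remaining constants omitted here for brevity ...
    WS: "WS",
}

// tokenName returns a readable label for t, falling back to the raw value.
func tokenName(t tokenType) string {
    if s, ok := tokenNames[t]; ok {
        return s
    }
    return fmt.Sprintf("tokenType(%d)", t)
}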
Variables
var EUnexpextedEOF = fmt.Errorf("Unexpected EOF.")
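EUnexpextedEOF is a sentinel error value, so callers can test for it directly. A minimal, hypothetical sketch (the index does not show which calls return this error, and the import path is assumed):

package main

import (
    "errors"
    "fmt"

    "example.com/tokenizer" // assumed import path; substitute the real module path
)

func main() {
    // scan is a stand-in for whichever call in this package can fail with
    // EUnexpextedEOF on a truncated stream; the index does not say which.
    scan := func() error { return tokenizer.EUnexpextedEOF }

    if err := scan(); errors.Is(err, tokenizer.EUnexpextedEOF) {
        fmt.Println("input ended in the middle of a token:", err)
    }
}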
Functions
This section is empty.
Types
type PositionTrackingScanner
type PositionTrackingScanner struct {
    // Embedded Scanner so our LineTrackingReader can be used just like
    // a Scanner.
    *bufio.Scanner
    // contains filtered or unexported fields
}
TODO(jwall): We need to sanitize the stream first? \r, \f, or \r\n should turn into \n.
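For context, the preprocessing that TODO refers to comes from the CSS Syntax spec: \r\n pairs, lone \r, and \f are each replaced by a single \n before tokenizing. A sketch of that normalization, purely illustrative and not part of this package's API:

package tokenizer

import "strings"

// normalizeNewlines replaces \r\n pairs, lone \r, and \f with \n, matching
// the CSS Syntax spec's input preprocessing step. Illustrative sketch only.
func normalizeNewlines(s string) string {
    s = strings.ReplaceAll(s, "\r\n", "\n")
    s = strings.ReplaceAll(s, "\r", "\n")
    return strings.ReplaceAll(s, "\f", "\n")
}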
func NewTrackingReader
func (*PositionTrackingScanner) Position
func (l *PositionTrackingScanner) Position() Position
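Putting the pieces together, a hedged usage sketch. It assumes NewTrackingReader accepts an io.Reader (its signature is not shown above), relies on Scan, Text, and Err being promoted from the embedded *bufio.Scanner, and assumes Position prints usefully with %+v; the import path is also assumed:

package main

import (
    "fmt"
    "strings"

    "example.com/tokenizer" // assumed import path; substitute the real module path
)

func main() {
    // Assumption: NewTrackingReader wraps an io.Reader.
    s := tokenizer.NewTrackingReader(strings.NewReader("a { color: red; }"))

    // Scan and Text are promoted from the embedded *bufio.Scanner.
    for s.Scan() {
        fmt.Printf("%q at %+v\n", s.Text(), s.Position())
    }
    if err := s.Err(); err != nil {
        fmt.Println("scan error:", err)
    }
}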