tokenizer

package
Version: v0.0.0-...-3f75658

This package is not in the latest version of its module.

Published: Nov 12, 2014 License: Artistic-2.0 Imports: 4 Imported by: 0

Documentation

Overview

Package tokenizer tokenizes a CSS stream. It follows the spec defined at http://www.w3.org/TR/css-syntax/#tokenization
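A minimal usage sketch follows. The import path is hypothetical, and this page does not state how Next signals end of stream, so treating any error (for example io.EOF or EUnexpextedEOF) as the stopping condition is an assumption:

package main

import (
	"fmt"
	"log"
	"strings"

	"example.com/css/tokenizer" // hypothetical import path for this package
)

func main() {
	t := tokenizer.New(strings.NewReader("a { color: red; }"))
	for {
		tok, err := t.Next()
		if err != nil {
			// Assumption: Next reports an error once the stream is
			// exhausted (possibly io.EOF or EUnexpextedEOF).
			log.Println("stopping:", err)
			return
		}
		fmt.Printf("%d:%d type=%v %q\n", tok.Line, tok.Column, tok.Type, tok.String)
	}
}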

Index

Constants

const (
	Ident          tokenType = iota // 0
	AtKeyword                       // 1
	String                          // 2
	BadString                       // 3
	BadUri                          // 4
	BadComment                      // 5
	Hash                            // 6
	Number                          // 7
	Percentage                      // 8
	Dimension                       // 9
	Uri                             // 10
	UnicodeRange                    // 11
	CDO                             // 12
	CDC                             // 13
	Colon                           // 14
	Semicolon                       // 15
	Comma                           // 16
	LBrace                          // 17
	RBrace                          // 18
	LParen                          // 19
	RParen                          // 20
	LBracket                        // 21
	RBracket                        // 22
	Includes                        // 23
	Prefixmatch                     // 24
	Suffixmatch                     // 25
	Dashmatch                       // 26
	Comment                         // 27
	Function                        // 28
	Delim                           // 29
	SubstringMatch                  // 30
	Column                          // 31
	WS                              // 32
)
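The constants share the unexported tokenType, so callers consume them by comparing or switching on Token.Type. A sketch, assuming the same import as the overview example; the grouping below is illustrative and not part of the package:

// classify is a hypothetical helper that dispatches on the constants above.
func classify(tok *tokenizer.Token) string {
	switch tok.Type {
	case tokenizer.Ident, tokenizer.AtKeyword, tokenizer.Hash:
		return "name"
	case tokenizer.Number, tokenizer.Percentage, tokenizer.Dimension:
		return "numeric"
	case tokenizer.BadString, tokenizer.BadUri, tokenizer.BadComment:
		return "error token"
	case tokenizer.WS, tokenizer.Comment:
		return "ignorable"
	default:
		return "punctuation or other"
	}
}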

Variables

var EUnexpextedEOF = fmt.Errorf("Unexpected EOF.")

Functions

This section is empty.

Types

type Position

type Position struct {
	Line   int
	Column int
}

type PositionTrackingScanner

type PositionTrackingScanner struct {
	// Embedded Scanner so a PositionTrackingScanner can be used just
	// like a bufio.Scanner.
	*bufio.Scanner
	// contains filtered or unexported fields
}

TODO(jwall): We need to sanitize the stream first? \r, \f, or \r\n turn into \n
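For reference, the CSS syntax spec's preprocessing step replaces \r\n, \r, and \f with \n before tokenization. A sketch of such a pre-pass using the standard bytes package; normalizeNewlines is a hypothetical helper, not part of this package:

import "bytes"

// normalizeNewlines applies the CSS preprocessing rule the TODO refers to:
// \r\n, \r, and \f each become a single \n.
func normalizeNewlines(b []byte) []byte {
	b = bytes.ReplaceAll(b, []byte("\r\n"), []byte("\n")) // CRLF first, so its CR is not rewritten alone
	b = bytes.ReplaceAll(b, []byte("\r"), []byte("\n"))
	b = bytes.ReplaceAll(b, []byte("\f"), []byte("\n"))
	return b
}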

func NewTrackingReader

func NewTrackingReader(r io.Reader, splitFunc func(data []byte, atEOF bool) (advance int, token []byte, err error)) *PositionTrackingScanner

func (*PositionTrackingScanner) Position

func (l *PositionTrackingScanner) Position() Position
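Because the Scanner is embedded, Scan and Text are available directly on a PositionTrackingScanner, with Position reporting the current line and column. A sketch using bufio.ScanRunes as the split function (imports bufio, fmt, strings, and the tokenizer package); whether Position refers to the start or the end of the most recent token, and whether counts are 0- or 1-based, is not documented here:

s := tokenizer.NewTrackingReader(strings.NewReader("a{\nb;}"), bufio.ScanRunes)
for s.Scan() { // Scan and Text come from the embedded *bufio.Scanner
	pos := s.Position()
	fmt.Printf("%d:%d %q\n", pos.Line, pos.Column, s.Text())
}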

type Token

type Token struct {
	Position
	Type   tokenType
	String string
}
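Position is embedded, so a token's location fields are promoted onto the Token itself. Continuing from the overview sketch:

tok, err := t.Next()
if err == nil {
	// Line and Column are promoted from the embedded Position;
	// no tok.Position.Line indirection is needed.
	fmt.Printf("%q at %d:%d\n", tok.String, tok.Line, tok.Column)
}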

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

func New

func New(r io.Reader) *Tokenizer

func (*Tokenizer) Next

func (t *Tokenizer) Next() (*Token, error)
