Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
var (
	// CloseTagPattern represents the regular expression pattern for close tags.
	CloseTagPattern = regexp.MustCompile(`^([0-9a-zA-Z]{2,16})?(%{1,16})(>)`)

	// OpenTagPattern represents the regular expression pattern for open tags.
	OpenTagPattern = regexp.MustCompile(`^(<)(%{1,16})([0-9a-zA-Z]{2,16})?`)
)
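Both patterns are anchored at the start of the input: an open tag is `<` followed by one to sixteen `%` characters and an optional short alphanumeric name, and a close tag mirrors that shape ending in `>`. A minimal standalone check of what the patterns accept (self-contained; not part of this package's API) might look like:

```go
package main

import (
	"fmt"
	"regexp"
)

// Copies of the package's exported patterns, for standalone experimentation.
var (
	CloseTagPattern = regexp.MustCompile(`^([0-9a-zA-Z]{2,16})?(%{1,16})(>)`)
	OpenTagPattern  = regexp.MustCompile(`^(<)(%{1,16})([0-9a-zA-Z]{2,16})?`)
)

func main() {
	// An open tag with a name: submatches are the full match, "<",
	// the percent run, and the optional name.
	fmt.Printf("%q\n", OpenTagPattern.FindStringSubmatch("<%go echo %>"))
	// → ["<%go" "<" "%" "go"]

	// A bare close tag still matches; the name group is empty.
	fmt.Println(CloseTagPattern.MatchString("%>"))   // true
	fmt.Println(OpenTagPattern.MatchString("plain")) // false
}
```

Because the patterns use `^`, they only match tags at the current read position, which is what a lexer rule needs when scanning the input incrementally.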
Functions ¶
This section is empty.
Types ¶
type CloseTagRule ¶
type CloseTagRule struct {
// contains filtered or unexported fields
}
CloseTagRule represents a rule for close tags.
func NewCloseTagRule ¶
func NewCloseTagRule() *CloseTagRule
NewCloseTagRule creates a new instance of CloseTagRule.
type EOFRule ¶
type EOFRule struct{}
EOFRule represents a rule for end-of-file.
type Error ¶
type Error struct {
	// Code is the error code.
	Code errorCode

	// Pos is the position where the error occurred.
	Pos int
}
Error represents an error that occurred during lexing.
type Lexer ¶
type Lexer struct {
// contains filtered or unexported fields
}
Lexer represents a template lexer.
func NewWithRules ¶
NewWithRules creates a new instance of Lexer with the given source reader and custom rules.
type OpenTagRule ¶
type OpenTagRule struct {
// contains filtered or unexported fields
}
OpenTagRule represents a rule for open tags.
func NewOpenTagRule ¶
func NewOpenTagRule() *OpenTagRule
NewOpenTagRule creates a new instance of OpenTagRule.
type Rule ¶
type Rule interface {
	// Test reports whether the rule matches the current input.
	Test(*sourceReader) (tokenType token.TokenType, ok bool)

	// Tokenize generates a token from the input.
	Tokenize(*sourceReader) (length int, more bool)
}
Rule defines the interface for lexer rules.
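Note that because `Test` and `Tokenize` take the package's unexported `*sourceReader`, custom rules passed to `NewWithRules` must live in this package. The following sketch illustrates the two-phase shape of a rule using stand-in types — `sourceReader`'s fields, `TokenType`, and the `TextRule` itself are assumptions for illustration, not this package's actual internals:

```go
package main

import "fmt"

// TokenType stands in for the token package's TokenType (assumption).
type TokenType int

const TokenText TokenType = iota

// sourceReader stands in for the package's unexported reader (assumption).
type sourceReader struct {
	src []byte
	pos int
}

// TextRule is a hypothetical rule that consumes plain text up to the
// next '<' (a possible open tag) or the end of the input.
type TextRule struct{}

// Test reports whether plain text starts at the current position.
func (TextRule) Test(sr *sourceReader) (TokenType, bool) {
	if sr.pos < len(sr.src) && sr.src[sr.pos] != '<' {
		return TokenText, true
	}
	return 0, false
}

// Tokenize consumes the text and reports the token's length and
// whether more input remains.
func (TextRule) Tokenize(sr *sourceReader) (length int, more bool) {
	start := sr.pos
	for sr.pos < len(sr.src) && sr.src[sr.pos] != '<' {
		sr.pos++
	}
	return sr.pos - start, sr.pos < len(sr.src)
}

func main() {
	sr := &sourceReader{src: []byte("hello <% name %>")}
	if _, ok := (TextRule{}).Test(sr); ok {
		n, more := (TextRule{}).Tokenize(sr)
		fmt.Println(n, more) // 6 true
	}
}
```

The split between a cheap `Test` and a consuming `Tokenize` lets the lexer probe each rule in order and only advance the reader once a rule claims the input.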