lexer

package v0.3.0

Published: Apr 16, 2024 License: MIT Imports: 8 Imported by: 1

Documentation

Index

Constants

This section is empty.

Variables

var (
	// CloseTagPattern represents the regular expression pattern for close tags.
	CloseTagPattern = regexp.MustCompile(`^([0-9a-zA-Z]{2,16})?(%{1,16})(>)`)
	// OpenTagPattern represents the regular expression pattern for open tags.
	OpenTagPattern = regexp.MustCompile(`^(<)(%{1,16})([0-9a-zA-Z]{2,16})?`)
)
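
A short sketch of what these patterns accept, exercised with the standard regexp package; the expressions below are copied verbatim from the variables above.

package main

import (
	"fmt"
	"regexp"
)

var (
	openTag  = regexp.MustCompile(`^(<)(%{1,16})([0-9a-zA-Z]{2,16})?`)
	closeTag = regexp.MustCompile(`^([0-9a-zA-Z]{2,16})?(%{1,16})(>)`)
)

func main() {
	// An open tag is "<", a run of 1-16 "%" characters, then an
	// optional alphanumeric name of 2-16 characters.
	fmt.Println(openTag.MatchString("<%"))   // true
	fmt.Println(openTag.MatchString("<%go")) // true (name group: "go")
	// A close tag mirrors it: optional name, "%" run, then ">".
	fmt.Println(closeTag.MatchString("%>"))   // true
	fmt.Println(closeTag.MatchString("go%>")) // true (name group: "go")
	fmt.Println(closeTag.MatchString(">"))    // false: at least one "%" is required
}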

Functions

This section is empty.

Types

type CloseTagRule

type CloseTagRule struct {
	// contains filtered or unexported fields
}

CloseTagRule represents a rule for close tags.

func NewCloseTagRule

func NewCloseTagRule() *CloseTagRule

NewCloseTagRule creates a new instance of CloseTagRule.

func (*CloseTagRule) Test

func (r *CloseTagRule) Test(sr *sourceReader) (token.TokenType, bool)

Test reports whether the input matches the close tag pattern.

func (CloseTagRule) Tokenize

func (r CloseTagRule) Tokenize(sr *sourceReader) (n int, more bool)

Tokenize generates a close tag token.

type EOFRule

type EOFRule struct{}

EOFRule represents a rule for end-of-file.

func (*EOFRule) Test

func (to *EOFRule) Test(sr *sourceReader) (token.TokenType, bool)

Test reports whether the end-of-file rule matches the current input.

func (*EOFRule) Tokenize

func (to *EOFRule) Tokenize(sr *sourceReader) (length int, more bool)

Tokenize generates an end-of-file token.

type Error

type Error struct {
	// Code is the error code.
	Code errorCode
	// Pos is the position where the error occurred.
	Pos int
}

Error represents an error that occurred during lexing.

func (Error) Error

func (le Error) Error() string

Error returns the error message for the lexer error.
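
Because Error carries the position of the failure, callers can recover it with errors.As. A minimal sketch, assuming the error returned by Lexer.NextToken is (or wraps) a lexer.Error, and that the errors and fmt packages are imported:

// inspectErr prints the failure position when err is a lexer.Error.
// The assumption that NextToken surfaces lexer.Error values is not
// confirmed by this page.
func inspectErr(err error) {
	var lexErr lexer.Error
	if errors.As(err, &lexErr) {
		fmt.Printf("lexing failed at position %d: %v\n", lexErr.Pos, lexErr)
	}
}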

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer represents a template lexer.

func New

func New(source io.Reader) *Lexer

New creates a new instance of Lexer with the given source reader and default rules.

func NewWithRules

func NewWithRules(source io.Reader, rules ...Rule) *Lexer

NewWithRules creates a new instance of Lexer with the given source reader and custom rules.
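
Note that Rule's methods take the unexported *sourceReader type, so rules cannot be implemented outside this package; NewWithRules is therefore presumably useful for reordering or subsetting the built-in rules. A sketch, assuming a strings.Reader source; the rule order shown is an assumption, not the package's documented default:

l := lexer.NewWithRules(strings.NewReader("hello <%name%> world"),
	lexer.NewOpenTagRule(),
	lexer.NewCloseTagRule(),
	&lexer.EOFRule{},
	&lexer.TextRule{}, // Test always returns true, so this rule presumably belongs last
)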

func (*Lexer) NextToken

func (l *Lexer) NextToken() (*token.Token, error)

NextToken returns the next token from the input stream.
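
A minimal end-to-end sketch. The module's import path is not shown on this page, so the path below is a placeholder, and the loop is bounded because the token package's end-of-file constant is likewise not shown here:

package main

import (
	"fmt"
	"strings"

	"example.com/module/lexer" // placeholder; substitute the real import path
)

func main() {
	l := lexer.New(strings.NewReader("hello <%name%> world"))
	for i := 0; i < 16; i++ { // bounded: the EOF token type isn't shown on this page
		tok, err := l.NextToken()
		if err != nil {
			fmt.Println("lex error:", err)
			return
		}
		fmt.Printf("%+v\n", tok)
	}
}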

type OpenTagRule

type OpenTagRule struct {
	// contains filtered or unexported fields
}

OpenTagRule represents a rule for open tags.

func NewOpenTagRule

func NewOpenTagRule() *OpenTagRule

NewOpenTagRule creates a new instance of OpenTagRule.

func (*OpenTagRule) Test

func (r *OpenTagRule) Test(sr *sourceReader) (token.TokenType, bool)

Test reports whether the input matches the open tag pattern.

func (OpenTagRule) Tokenize

func (r OpenTagRule) Tokenize(sr *sourceReader) (n int, more bool)

Tokenize generates an open tag token.

type Rule

type Rule interface {
	// Test tests if the rule matches the current input.
	Test(*sourceReader) (tokenType token.TokenType, ok bool)
	// Tokenize generates a token from the input.
	Tokenize(*sourceReader) (length int, more bool)
}

Rule defines the interface for lexer rules.
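
The interface implies a two-phase protocol: the lexer asks each rule whether it matches (Test), then lets the first matching rule consume input (Tokenize). The sketch below is one reading of that protocol, not the package's actual implementation; in particular, treating more as "call Tokenize again to keep consuming" and the newToken constructor are assumptions:

for _, rule := range rules {
	if tokType, ok := rule.Test(sr); ok {
		n := 0
		for more := true; more; {
			var length int
			length, more = rule.Tokenize(sr)
			n += length
		}
		return newToken(tokType, n), nil // newToken is hypothetical
	}
}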

type TextRule

type TextRule struct{}

TextRule represents a rule for text.

func (*TextRule) Test

func (r *TextRule) Test(sr *sourceReader) (token.TokenType, bool)

Test always returns true: the text rule matches any input.

func (*TextRule) Tokenize

func (r *TextRule) Tokenize(sr *sourceReader) (n int, more bool)

Tokenize generates a text token.
