Documentation
Overview
Package token is generated by GoGLL. Do not edit.
Index
- Variables
- type Token
- func (t *Token) GetInput() []rune
- func (t *Token) GetLineColumn() (line, col int)
- func (t *Token) Lext() int
- func (t *Token) Literal() []rune
- func (t *Token) LiteralString() string
- func (t *Token) LiteralStringStripEscape() string
- func (t *Token) LiteralStripEscape() []rune
- func (t *Token) Rext() int
- func (t *Token) String() string
- func (t *Token) Suppress() bool
- func (t *Token) Type() Type
- func (t *Token) TypeID() string
- type Type
Constants
This section is empty.
Variables
var IDToType = map[string]Type{
"Error": 0,
"$": 1,
"id": 2,
"id_continue": 3,
"id_start": 4,
}
var StringToType = map[string]Type{
	"Error": Error,
	"EOF":   EOF,
	"T_0":   T_0,
	"T_1":   T_1,
	"T_2":   T_2,
}
var TypeToID = []string{
"Error",
"$",
"id",
"id_continue",
"id_start",
}
var TypeToString = []string{
"Error",
"EOF",
"T_0",
"T_1",
"T_2",
}
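The four tables above translate in both directions between numeric Type values, the grammar's token IDs (such as "id"), and the generated constant names (such as "T_0"). The sketch below reproduces the tables from this page in a standalone program to show the lookups; the constant values (Error = 0 through T_2 = 4) are inferred from the IDToType table and are not stated explicitly on this page.

```go
package main

import "fmt"

// Type and its constants are assumed to mirror the generated package,
// with values inferred from the IDToType table above (Error = 0 .. T_2 = 4).
type Type int

const (
	Error Type = iota // 0, grammar ID "Error"
	EOF               // 1, grammar ID "$"
	T_0               // 2, grammar ID "id"
	T_1               // 3, grammar ID "id_continue"
	T_2               // 4, grammar ID "id_start"
)

// Tables copied from the package documentation above.
var IDToType = map[string]Type{
	"Error": 0, "$": 1, "id": 2, "id_continue": 3, "id_start": 4,
}
var TypeToID = []string{"Error", "$", "id", "id_continue", "id_start"}
var TypeToString = []string{"Error", "EOF", "T_0", "T_1", "T_2"}

func main() {
	t := IDToType["id"]                       // look up a grammar ID
	fmt.Println(t == T_0)                     // true: "id" is token type T_0
	fmt.Println(TypeToID[t], TypeToString[t]) // id T_0
	fmt.Println(TypeToID[IDToType["$"]])      // round trip: $
}
```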
Functions
This section is empty.
Types
type Token
type Token struct {
// contains filtered or unexported fields
}
Token is returned by the lexer for every scanned lexical token.
func New
New returns a new token. lext is the left extent and rext the right extent of the token in the input. input is the input slice scanned by the lexer.
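The extents can be pictured as rune indices into the lexer's input, with the token's literal being the slice between them. This page does not show New's exact signature, so the minimal sketch below assumes a plausible form, New(t Type, lext, rext int, input []rune), and a hypothetical Literal method, purely to illustrate how the left and right extents delimit the token.

```go
package main

import "fmt"

// A minimal sketch of the documented behaviour, NOT the generated code:
// the exact signature of New is not shown on this page, so a plausible
// form New(t Type, lext, rext int, input []rune) is assumed here.
type Type int

type Token struct {
	typ        Type
	lext, rext int    // left and right extents of the token in the input
	input      []rune // the input slice scanned by the lexer
}

func New(t Type, lext, rext int, input []rune) *Token {
	return &Token{typ: t, lext: lext, rext: rext, input: input}
}

// Literal returns the runes between the token's extents.
func (t *Token) Literal() []rune { return t.input[t.lext:t.rext] }

func main() {
	input := []rune("let count = 1")
	tok := New(2, 4, 9, input)         // token type 2 ("id") spanning "count"
	fmt.Println(string(tok.Literal())) // count
}
```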
func (*Token) GetLineColumn
GetLineColumn returns the line and column of the left extent of t.
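Line and column follow directly from the left extent: count the newlines in the input before lext. The generated implementation is not shown on this page, so the function below is only an illustrative sketch of that computation, assuming 1-based lines and columns.

```go
package main

import "fmt"

// A sketch of how a token's line and column can be derived from its left
// extent: count newlines in the input up to lext. This illustrates the
// documented behaviour; the generated implementation is not shown here.
// Lines and columns are assumed to be 1-based.
func lineColumn(input []rune, lext int) (line, col int) {
	line, col = 1, 1
	for _, r := range input[:lext] {
		if r == '\n' {
			line++
			col = 1
		} else {
			col++
		}
	}
	return line, col
}

func main() {
	input := []rune("a = 1\nbb = 2")
	line, col := lineColumn(input, 6) // left extent of "bb"
	fmt.Println(line, col)            // 2 1
}
```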
func (*Token) LiteralString
LiteralString returns string(t.Literal()).
func (*Token) LiteralStringStripEscape
LiteralStringStripEscape returns string(t.LiteralStripEscape()).
func (*Token) LiteralStripEscape
LiteralStripEscape returns the literal runes of t scanned by the lexer.