Documentation
Overview
Package token is generated by GoGLL. Do not edit.
Index
- Variables
- type Token
- func (t *Token) GetInput() []rune
- func (t *Token) GetLineColumn() (line, col int)
- func (t *Token) Lext() int
- func (t *Token) Literal() []rune
- func (t *Token) LiteralString() string
- func (t *Token) Rext() int
- func (t *Token) String() string
- func (t *Token) Suppress() bool
- func (t *Token) Type() Type
- func (t *Token) TypeID() string
- type Type
Constants
This section is empty.
Variables
var StringToType = map[string]Type{
    "Error": Error, "EOF": EOF,
    "T_0": T_0, "T_1": T_1, "T_2": T_2, "T_3": T_3, "T_4": T_4,
    "T_5": T_5, "T_6": T_6, "T_7": T_7, "T_8": T_8, "T_9": T_9,
    "T_10": T_10, "T_11": T_11, "T_12": T_12, "T_13": T_13, "T_14": T_14,
    "T_15": T_15, "T_16": T_16, "T_17": T_17, "T_18": T_18, "T_19": T_19,
    "T_20": T_20, "T_21": T_21, "T_22": T_22, "T_23": T_23, "T_24": T_24,
    "T_25": T_25, "T_26": T_26, "T_27": T_27, "T_28": T_28, "T_29": T_29,
}
var Suppress = []bool{
    false, false, false, false, false, false, false, false,
    false, false, false, false, false, false, false, false,
    false, false, false, false, false, false, false, false,
    false, false, false, false, false, false, false, false,
}
var TypeToID = []string{
"Error",
"$",
"!",
"&",
"(",
")",
"*",
"+",
".",
"/",
":",
";",
"<",
">",
"?",
"[",
"]",
"any",
"char_lit",
"empty",
"letter",
"lowcase",
"not",
"nt",
"number",
"package",
"string_lit",
"tokid",
"upcase",
"{",
"|",
"}",
}
var TypeToString = []string{
"Error",
"EOF",
"T_0",
"T_1",
"T_2",
"T_3",
"T_4",
"T_5",
"T_6",
"T_7",
"T_8",
"T_9",
"T_10",
"T_11",
"T_12",
"T_13",
"T_14",
"T_15",
"T_16",
"T_17",
"T_18",
"T_19",
"T_20",
"T_21",
"T_22",
"T_23",
"T_24",
"T_25",
"T_26",
"T_27",
"T_28",
"T_29",
}
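Taken together, StringToType, TypeToString, TypeToID and Suppress are parallel lookups indexed by Type, so a token type can be round-tripped between its Go name, its grammar identifier and its suppression flag. A minimal sketch; the import path yourmodule/token is a placeholder for wherever GoGLL generated this package:

package main

import (
    "fmt"

    "yourmodule/token" // placeholder: substitute the module path of the generated package
)

func main() {
    // T_23 is the token type whose grammar identifier is "package".
    t := token.StringToType["T_23"]

    fmt.Println(token.TypeToString[t]) // T_23
    fmt.Println(token.TypeToID[t])     // package
    fmt.Println(token.Suppress[t])     // false: tokens of this type are passed to the parser
}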
Functions
This section is empty.
Types
type Token
type Token struct {
// contains filtered or unexported fields
}
Token is returned by the lexer for every scanned lexical token.
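As an illustration, a caller holding scanned tokens can inspect them entirely through the accessors listed in the index above. The helper below is hypothetical (not part of the generated package) and assumes the same placeholder import as the sketch under Variables:

// report prints one line per token that is not suppressed.
// toks is assumed to be the token slice produced by the generated lexer.
func report(toks []*token.Token) {
    for _, tok := range toks {
        if tok.Suppress() { // always false for this grammar (see var Suppress above)
            continue
        }
        fmt.Printf("%-12s %q (extent %d-%d)\n",
            tok.TypeID(), tok.LiteralString(), tok.Lext(), tok.Rext())
    }
}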
func New
New returns a new token. lext is the left extent and rext the right extent of the token in the input. input is the input slice scanned by the lexer.
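The signature is not reproduced above. In GoGLL-generated token packages it is typically New(t Type, lext, rext int, input []rune) *Token; the sketch below assumes that parameter order and should not be read as a confirmed API:

// Assumed signature: New(t Type, lext, rext int, input []rune) *Token.
// T_23 is the "package" keyword token; here it spans input[0:7].
func newPackageToken() *token.Token {
    input := []rune("package gogll")
    return token.New(token.T_23, 0, 7, input)
}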
func (*Token) GetLineColumn
func (t *Token) GetLineColumn() (line, col int)
GetLineColumn returns the line and column of the left extent of t.
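This is the usual hook for position-aware diagnostics. A hypothetical helper (the name posError is ours, not part of the package):

// posError builds an error message prefixed with the token's position.
func posError(tok *token.Token, msg string) error {
    line, col := tok.GetLineColumn()
    return fmt.Errorf("%d:%d: %s: %s %q", line, col, msg, tok.TypeID(), tok.LiteralString())
}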
func (*Token) LiteralString
func (t *Token) LiteralString() string
LiteralString returns string(t.Literal())
type Type
type Type int
Type is the token type.
const (
    Error Type = iota // Error
    EOF               // $
    T_0               // !
    T_1               // &
    T_2               // (
    T_3               // )
    T_4               // *
    T_5               // +
    T_6               // .
    T_7               // /
    T_8               // :
    T_9               // ;
    T_10              // <
    T_11              // >
    T_12              // ?
    T_13              // [
    T_14              // ]
    T_15              // any
    T_16              // char_lit
    T_17              // empty
    T_18              // letter
    T_19              // lowcase
    T_20              // not
    T_21              // nt
    T_22              // number
    T_23              // package
    T_24              // string_lit
    T_25              // tokid
    T_26              // upcase
    T_27              // {
    T_28              // |
    T_29              // }
)
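Because the generated names T_0 through T_29 are not mnemonic, the trailing comments above (or TypeToID) are how callers map them back to grammar symbols. A hedged sketch of dispatching on a scanned token, reusing the placeholder import from the earlier examples:

// handle is a hypothetical dispatcher; the case comments restate the
// mapping from the constant block above.
func handle(tok *token.Token) {
    switch tok.Type() {
    case token.T_23: // the "package" keyword
        // handle a package declaration
    case token.T_24: // string_lit
        // handle a string literal
    case token.EOF: // $
        // end of input
    default:
        fmt.Println("unhandled token:", tok.TypeID()) // TypeID gives the grammar-level name
    }
}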