Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
var Builtins = map[string]string{
"LEN": "LEN",
"ABS": "ABS",
"ATN": "ATN",
"COS": "COS",
"EXP": "EXP",
"INT": "INT",
"LN": "LN",
"LOG": "LOG",
"RND": "RND",
"SGN": "SGN",
"SIN": "SIN",
"SQR": "SQR",
"TAN": "TAN",
"GET": "GET",
"LOOKUP": "LOOKUP",
"PATH$": "PATH$",
"STR$": "STR$",
"CHR$": "CHR$",
"PITCH": "PITCH",
}
This is a bit hacky: every builtin name maps to itself, so the map effectively serves as a membership set for recognising builtin function names.
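Because every key maps to itself, checking whether an identifier is a builtin reduces to a map lookup. The sketch below illustrates that idea; `isBuiltin` is a hypothetical helper, not part of the package, and the case-insensitive lookup is an assumption about how the lexer might use the map.

```go
package main

import (
	"fmt"
	"strings"
)

// Builtins mirrors (a subset of) the package variable above: every
// builtin name maps to itself, so the map acts as a membership set.
var Builtins = map[string]string{
	"LEN": "LEN", "ABS": "ABS", "RND": "RND", "STR$": "STR$",
}

// isBuiltin is a hypothetical helper showing how a lexer might test
// an identifier against the set, upper-casing it first.
func isBuiltin(name string) bool {
	_, ok := Builtins[strings.ToUpper(name)]
	return ok
}

func main() {
	fmt.Println(isBuiltin("len")) // true
	fmt.Println(isBuiltin("FOO")) // false
}
```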
Functions ¶
This section is empty.
Types ¶
type Lexer ¶
type Lexer struct {
	Source          string        // source code string
	Tokens          []token.Token // buffer of tokens created by Scan()
	CurrentPosition int           // position in the string
	// contains filtered or unexported fields
}
Lexer describes a type with methods for lexing a line of source code and generating tokens. To tokenize a line of source code, all we do is:

l := &Lexer{}
tokens := l.Scan(source)
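To make the tokenizing step concrete, here is a self-contained toy stand-in for the scan step. `toyScan` is hypothetical and far simpler than the real lexer (no string literals, operators, or numeric tokens); it only shows the shape of "one source line in, a slice of tokens out".

```go
package main

import (
	"fmt"
	"strings"
)

// toyScan is a hypothetical stand-in for the Lexer's scan step: it
// splits one line of BASIC-style source on whitespace. The real
// lexer is much richer (string literals, operators, numbers, etc.).
func toyScan(source string) []string {
	return strings.Fields(source)
}

func main() {
	tokens := toyScan("PRINT LEN A$")
	fmt.Println(tokens) // [PRINT LEN A$]
}
```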
func (*Lexer) GetTokenPosition ¶
GetTokenPosition returns the current token position.
func (*Lexer) JumpToToken ¶
JumpToToken sets the current token position.
func (*Lexer) NextToken ¶
NextToken returns the token at the current token position in the buffer and consumes it, advancing to the next token.
func (*Lexer) PeekToken ¶
PeekToken returns the token at the current tokenPosition in the buffer but does not consume it.
func (*Lexer) SetTokenPosition ¶
SetTokenPosition sets the current token position.
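The peek/consume/jump methods above describe a standard buffered-token interface, as a parser would use it. The sketch below is a hypothetical re-implementation of that behaviour, not the package's actual code: `tok` stands in for `token.Token` (only a `Literal` field is assumed), and the method bodies are guesses at the documented semantics.

```go
package main

import "fmt"

// tok is a minimal stand-in for token.Token; only the Literal field
// is assumed here for illustration.
type tok struct{ Literal string }

// bufLexer sketches the buffered-token behaviour described above:
// PeekToken inspects the current token without consuming it, while
// NextToken returns it and advances the position.
type bufLexer struct {
	tokens []tok
	pos    int
}

// PeekToken returns the current token without consuming it.
func (l *bufLexer) PeekToken() tok { return l.tokens[l.pos] }

// NextToken returns the current token and advances the position.
func (l *bufLexer) NextToken() tok {
	t := l.tokens[l.pos]
	l.pos++
	return t
}

// GetTokenPosition returns the current token position.
func (l *bufLexer) GetTokenPosition() int { return l.pos }

// JumpToToken sets the current token position.
func (l *bufLexer) JumpToToken(p int) { l.pos = p }

func main() {
	l := &bufLexer{tokens: []tok{{"PRINT"}, {"LEN"}, {"A$"}}}
	fmt.Println(l.PeekToken().Literal) // PRINT (not consumed)
	fmt.Println(l.NextToken().Literal) // PRINT (consumed)
	fmt.Println(l.GetTokenPosition())  // 1
	l.JumpToToken(0)
	fmt.Println(l.NextToken().Literal) // PRINT again after rewinding
}
```

Keeping the whole line's tokens in a buffer like this lets a parser peek ahead and rewind cheaply, which is why the Lexer exposes position getters and setters alongside NextToken and PeekToken.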