Documentation ¶
Overview ¶
Package scanner implements the PL/0 scanner that performs a lexical analysis of the source code.
Index ¶
Constants ¶
const (
	Unknown = Token(iota)
	Identifier
	Number
	Plus
	Minus
	Times
	Divide
	Equal
	NotEqual
	Less
	LessEqual
	Greater
	GreaterEqual
	LeftParenthesis
	RightParenthesis
	Comma
	Colon
	Semicolon
	ProgramEnd
	Becomes
	Read
	Write
	OddWord
	BeginWord
	EndWord
	IfWord
	ThenWord
	WhileWord
	DoWord
	CallWord
	ConstWord
	VarWord
	ProcedureWord
)
Tokens that are supported by the PL/0 scanner.
const EndOfFileCharacter = 0
EndOfFileCharacter is the last character read from the source code when the end of the file is reached.
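A reader loop might use the constant as an end-of-input sentinel, roughly as in this self-contained sketch (the readCharacter helper is hypothetical and only illustrates the idea, it is not part of the package):

```go
package main

import "fmt"

const EndOfFileCharacter = 0

// readCharacter is a hypothetical helper: it returns the character at pos,
// or EndOfFileCharacter once the source is exhausted.
func readCharacter(source []byte, pos int) byte {
	if pos >= len(source) {
		return EndOfFileCharacter
	}
	return source[pos]
}

func main() {
	source := []byte("ab")
	for pos := 0; ; pos++ {
		ch := readCharacter(source, pos)
		if ch == EndOfFileCharacter {
			fmt.Println("end of file")
			break
		}
		fmt.Printf("%c\n", ch)
	}
}
```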
Variables ¶
var (
	// Empty is an empty token set.
	Empty = Tokens{}

	// TokenNames maps tokens to their string representation.
	TokenNames = map[Token]string{
		Unknown:          "unknown",
		Identifier:       "identifier",
		Number:           "number",
		Plus:             "plus",
		Minus:            "minus",
		Times:            "times",
		Divide:           "divide",
		Equal:            "equal",
		NotEqual:         "notEqual",
		Less:             "less",
		LessEqual:        "lessEqual",
		Greater:          "greater",
		GreaterEqual:     "greaterEqual",
		LeftParenthesis:  "leftParenthesis",
		RightParenthesis: "rightParenthesis",
		Comma:            "comma",
		Colon:            "colon",
		Semicolon:        "semicolon",
		ProgramEnd:       "programEnd",
		Becomes:          "becomes",
		Read:             "read",
		Write:            "write",
		OddWord:          "odd",
		BeginWord:        "begin",
		EndWord:          "end",
		IfWord:           "if",
		ThenWord:         "then",
		WhileWord:        "while",
		DoWord:           "do",
		CallWord:         "call",
		ConstWord:        "const",
		VarWord:          "var",
		ProcedureWord:    "procedure",
	}
)
Functions ¶
This section is empty.
Types ¶
type Scanner ¶
type Scanner interface {
Scan(content []byte) (TokenStream, error)
}
Scanner is the public interface of the scanner implementation.
func NewScanner ¶
func NewScanner() Scanner
NewScanner returns the public interface of the private scanner implementation.
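A typical call sequence might look like the following sketch. The stand-in types and the fakeScanner placeholder below only mirror the documented signatures so the example is self-contained; a real program would import the scanner package instead of redefining them:

```go
package main

import "fmt"

// Stand-ins mirroring the documented API so this sketch compiles on its own.
type TokenDescription struct {
	TokenName, TokenValue string
	Line, Column          int
}
type TokenStream []TokenDescription

type Scanner interface {
	Scan(content []byte) (TokenStream, error)
}

// fakeScanner is a trivial placeholder for the package's private implementation.
type fakeScanner struct{}

func (fakeScanner) Scan(content []byte) (TokenStream, error) {
	// A real scanner would tokenize content; this placeholder returns one token.
	return TokenStream{{TokenName: "identifier", TokenValue: "x", Line: 1, Column: 1}}, nil
}

func NewScanner() Scanner { return fakeScanner{} }

func main() {
	scanner := NewScanner()
	stream, err := scanner.Scan([]byte("x"))
	if err != nil {
		fmt.Println("scan failed:", err)
		return
	}
	for _, td := range stream {
		fmt.Printf("%s %q at %d:%d\n", td.TokenName, td.TokenValue, td.Line, td.Column)
	}
}
```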
type Token ¶
type Token int
Token is a type that represents a token in the source code.
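The pattern behind Token and TokenNames can be sketched in a small self-contained example. The token subset and names below are taken from the constants and the TokenNames variable documented above; everything else (the main function, the truncated set) is illustrative:

```go
package main

import "fmt"

// Token enumerates lexical token kinds, mirroring the package's Token type.
type Token int

// A subset of the package's token constants, declared with iota
// so each token gets a distinct integer value.
const (
	Unknown = Token(iota)
	Identifier
	Number
	Plus
)

// tokenNames maps tokens to their string representation,
// following the TokenNames variable documented above.
var tokenNames = map[Token]string{
	Unknown:    "unknown",
	Identifier: "identifier",
	Number:     "number",
	Plus:       "plus",
}

func main() {
	fmt.Println(tokenNames[Number]) // prints "number"
}
```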
type TokenDescription ¶
type TokenDescription struct {
	Token        Token
	TokenName    string
	TokenValue   string
	Line, Column int
	CurrentLine  []byte
}
TokenDescription describes a token with its kind, name, value, and position in the source code.
type TokenSet ¶
type TokenSet interface {
ToTokens() Tokens
}
TokenSet is an interface implemented by types that can be converted to the 'Tokens' type.
type TokenStream ¶ added in v1.1.0
type TokenStream []TokenDescription
TokenStream is the table of token descriptions produced by the lexical analysis of the source code. It is consumed by the parser.
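Because TokenStream is a plain slice, a parser can walk it with an index-based cursor. The following self-contained sketch shows one way to do that; the cursor type and its next method are hypothetical helpers, not part of the package:

```go
package main

import "fmt"

// TokenDescription and TokenStream stand in for the documented types.
type TokenDescription struct {
	TokenName string
}
type TokenStream []TokenDescription

// cursor is a hypothetical helper a parser might use to walk the stream.
type cursor struct {
	stream TokenStream
	pos    int
}

// next returns the current token description and advances the cursor;
// ok is false once the stream is exhausted.
func (c *cursor) next() (td TokenDescription, ok bool) {
	if c.pos >= len(c.stream) {
		return TokenDescription{}, false
	}
	td = c.stream[c.pos]
	c.pos++
	return td, true
}

func main() {
	stream := TokenStream{{TokenName: "const"}, {TokenName: "identifier"}, {TokenName: "becomes"}}
	c := cursor{stream: stream}
	for td, ok := c.next(); ok; td, ok = c.next() {
		fmt.Println(td.TokenName)
	}
}
```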