scanner

package
v1.1.0
Published: Mar 3, 2024 License: Apache-2.0 Imports: 5 Imported by: 0

Documentation

Overview

Package scanner implements the PL/0 scanner that performs a lexical analysis of the source code.

Constants

const (
	Unknown = Token(iota)
	Identifier
	Number
	Plus
	Minus
	Times
	Divide
	Equal
	NotEqual
	Less
	LessEqual
	Greater
	GreaterEqual
	LeftParenthesis
	RightParenthesis
	Comma
	Colon
	Semicolon
	ProgramEnd
	Becomes
	Read
	Write
	OddWord
	BeginWord
	EndWord
	IfWord
	ThenWord
	WhileWord
	DoWord
	CallWord
	ConstWord
	VarWord
	ProcedureWord
)

Tokens that are supported by the PL/0 scanner.

const EndOfFileCharacter = 0

EndOfFileCharacter is the character that is read once the end of the source code is reached.
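
As an illustrative sketch only (the import path below is a placeholder, and the helper is not part of this package), a character reader could fall back to EndOfFileCharacter once the source content is exhausted:

package readerdemo

// Placeholder import path; substitute the real module path of this package.
import "example.com/pl0/scanner"

// nextCharacter is a hypothetical helper that returns the character at the
// given index, or EndOfFileCharacter once the content is exhausted.
func nextCharacter(content []byte, index int) byte {
	if index >= len(content) {
		return scanner.EndOfFileCharacter
	}

	return content[index]
}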

Variables

var (
	// Empty is an empty token set.
	Empty = Tokens{}

	// TokenNames maps tokens to their string representation.
	TokenNames = map[Token]string{
		Unknown:          "unknown",
		Identifier:       "identifier",
		Number:           "number",
		Plus:             "plus",
		Minus:            "minus",
		Times:            "times",
		Divide:           "divide",
		Equal:            "equal",
		NotEqual:         "notEqual",
		Less:             "less",
		LessEqual:        "lessEqual",
		Greater:          "greater",
		GreaterEqual:     "greaterEqual",
		LeftParenthesis:  "leftParenthesis",
		RightParenthesis: "rightParenthesis",
		Comma:            "comma",
		Colon:            "colon",
		Semicolon:        "semicolon",
		ProgramEnd:       "programEnd",
		Becomes:          "becomes",
		Read:             "read",
		Write:            "write",
		OddWord:          "odd",
		BeginWord:        "begin",
		EndWord:          "end",
		IfWord:           "if",
		ThenWord:         "then",
		WhileWord:        "while",
		DoWord:           "do",
		CallWord:         "call",
		ConstWord:        "const",
		VarWord:          "var",
		ProcedureWord:    "procedure",
	}
)
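
As a small usage sketch (the import path is a placeholder, since the module path is not shown here), TokenNames can be used to render tokens in diagnostics:

package main

import (
	"fmt"

	// Placeholder import path; substitute the real module path of this package.
	"example.com/pl0/scanner"
)

func main() {
	// Look up the human-readable name of a token; fall back to the name of
	// Unknown for tokens that are missing from the map.
	name, ok := scanner.TokenNames[scanner.LessEqual]
	if !ok {
		name = scanner.TokenNames[scanner.Unknown]
	}

	fmt.Println(name) // prints "lessEqual"
}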

Functions

This section is empty.

Types

type Scanner

type Scanner interface {
	Scan(content []byte) (TokenStream, error)
}

Scanner is the public interface of the scanner implementation.

func NewScanner

func NewScanner() Scanner

NewScanner returns the public interface of the private scanner implementation.
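
A minimal end-to-end sketch, assuming a placeholder import path and a short PL/0 source snippet: create a scanner, scan the source, and print every token description in the resulting stream.

package main

import (
	"fmt"
	"log"

	// Placeholder import path; substitute the real module path of this package.
	"example.com/pl0/scanner"
)

func main() {
	source := []byte("const max = 100;")

	// Scan the source code into a token stream.
	stream, err := scanner.NewScanner().Scan(source)
	if err != nil {
		log.Fatal(err)
	}

	// Print each token with its position, name, and value.
	for _, td := range stream {
		fmt.Printf("%d:%d %s %q\n", td.Line, td.Column, td.TokenName, td.TokenValue)
	}
}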

type Token

type Token int

Token is a type that represents a token in the source code.

func (Token) In

func (token Token) In(set Tokens) bool

In reports whether the token is contained in the given token set.
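
For example, a parser built on top of this scanner might use In to test set membership before branching; the token grouping below is an assumption for illustration, not a set defined by the package (placeholder import path).

package parserdemo

// Placeholder import path; substitute the real module path of this package.
import "example.com/pl0/scanner"

// startsStatement reports whether a token may begin a PL/0 statement.
// The chosen token set is illustrative only.
func startsStatement(token scanner.Token) bool {
	return token.In(scanner.Tokens{
		scanner.Identifier,
		scanner.BeginWord,
		scanner.IfWord,
		scanner.WhileWord,
		scanner.CallWord,
		scanner.Read,
		scanner.Write,
	})
}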

func (Token) ToTokens

func (t Token) ToTokens() Tokens

Token.ToTokens converts a token to a token set to satisfy the TokenSet interface.

type TokenDescription

type TokenDescription struct {
	Token        Token
	TokenName    string
	TokenValue   string
	Line, Column int
	CurrentLine  []byte
}

TokenDescription describes a token with its kind, name, value, and position in the source code, together with the source line it appears on.
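
As a sketch (placeholder import path; the helper is not part of the package), the fields of a TokenDescription can be combined into a diagnostic message:

package parserdemo

import (
	"fmt"

	// Placeholder import path; substitute the real module path of this package.
	"example.com/pl0/scanner"
)

// describe formats a token description for a diagnostic message, using the
// token name and value, its position, and the raw source line captured by
// the scanner.
func describe(td scanner.TokenDescription) string {
	return fmt.Sprintf("%s %q at line %d, column %d: %s",
		td.TokenName, td.TokenValue, td.Line, td.Column, string(td.CurrentLine))
}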

type TokenSet

type TokenSet interface {
	ToTokens() Tokens
}

TokenSet is an interface for types that can be converted to the Tokens type.

type TokenStream added in v1.1.0

type TokenStream []TokenDescription

TokenStream is the result of the lexical analysis of the source code: a table of token descriptions that is consumed by the parser.
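
As a hypothetical sketch of how a parser might consume the stream (not part of this package; placeholder import path), a simple cursor can advance over the token descriptions one at a time:

package parserdemo

// Placeholder import path; substitute the real module path of this package.
import "example.com/pl0/scanner"

// cursor is a hypothetical helper that walks a token stream the way a
// recursive-descent parser might.
type cursor struct {
	stream scanner.TokenStream
	index  int
}

// next returns the current token description and advances the cursor,
// reporting false once the stream is exhausted.
func (c *cursor) next() (scanner.TokenDescription, bool) {
	if c.index >= len(c.stream) {
		return scanner.TokenDescription{}, false
	}

	td := c.stream[c.index]
	c.index++

	return td, true
}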

type Tokens

type Tokens []Token

Tokens represents a set of tokens.

func Set

func Set(tss ...TokenSet) Tokens

Set joins all tokens contained in the given TokenSet values into a single token set. Duplicate tokens are removed.
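
Because both Token and Tokens satisfy TokenSet, Set can mix single tokens and token sets in one call. The groupings below are assumptions for illustration, not sets defined by the package (placeholder import path):

package parserdemo

// Placeholder import path; substitute the real module path of this package.
import "example.com/pl0/scanner"

// Illustrative token sets; the groupings are not defined by the scanner package.
var (
	relationalOperators = scanner.Tokens{
		scanner.Equal, scanner.NotEqual,
		scanner.Less, scanner.LessEqual,
		scanner.Greater, scanner.GreaterEqual,
	}

	// Set joins single tokens and token sets into one set without duplicates.
	conditionFollow = scanner.Set(relationalOperators, scanner.ThenWord, scanner.DoWord)
)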

func (Tokens) ToTokens

func (t Tokens) ToTokens() Tokens

Tokens.ToTokens returns the token set itself to satisfy the TokenSet interface.
