tokenizer

package v0.0.0-...-3da7ec1
Published: Mar 22, 2022 License: GPL-3.0 Imports: 4 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Scanner

type Scanner struct {
	// contains filtered or unexported fields
}

Scanner represents a non-discriminatory reader.

func NewScanner

func NewScanner(reader io.Reader) *Scanner

NewScanner returns a new instance of Scanner.

func (*Scanner) Scan

func (scanner *Scanner) Scan() (tok Token, lit string)

Scan returns the next token and literal value.
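
A minimal usage sketch: the loop below assumes Scan reports the EOF token once the underlying reader is exhausted, and the import path is hypothetical since the module path is elided above.

package main

import (
	"fmt"
	"strings"

	"example.com/tokenizer" // hypothetical import path; substitute the real module path
)

func main() {
	// Scan a small input token by token until EOF is reported.
	scanner := tokenizer.NewScanner(strings.NewReader("(a => b) + c"))
	for {
		tok, lit := scanner.Scan()
		if tok == tokenizer.EOF {
			break
		}
		fmt.Printf("token=%d literal=%q\n", tok, lit)
	}
}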

type Token

type Token int

Token represents a lexical token.

const (
	// Illegal
	ILLEGAL Token = iota

	// Whitespace
	EOF
	NL
	TAB

	// Matches
	PAREN_OPEN
	PAREN_CLOSE
	BRACE_OPEN
	BRACE_CLOSE
	BRACKET_OPEN
	BRACKET_CLOSE

	// Arrows
	FAT_ARROW
	SLIM_ARROW

	// Comparison
	LT
	GT
	LTE
	GTE
	EQ
	EQEQ
	UNEQ

	// Hyphen
	HYPHEN

	// Arithmetic
	PLUS
	MINUS
	MUL
	DIV

	// Context
	PERIOD_MARK
	QUESTION_MARK
	EXCLAIM_MARK

	// Literals
	IDENT
)
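
As a rough illustration of how these constants might be consumed, the sketch below groups a token into the coarse categories suggested by the comments in the const block. The describe helper and its category strings are my own, not part of the package, and the import path is hypothetical.

package tokenizerdemo

import "example.com/tokenizer" // hypothetical import path

// describe maps a Token to the coarse category named by the comment
// groups above; the helper and its strings are illustrative only.
func describe(tok tokenizer.Token) string {
	switch tok {
	case tokenizer.EOF, tokenizer.NL, tokenizer.TAB:
		return "whitespace"
	case tokenizer.PAREN_OPEN, tokenizer.PAREN_CLOSE,
		tokenizer.BRACE_OPEN, tokenizer.BRACE_CLOSE,
		tokenizer.BRACKET_OPEN, tokenizer.BRACKET_CLOSE:
		return "matches"
	case tokenizer.FAT_ARROW, tokenizer.SLIM_ARROW:
		return "arrow"
	case tokenizer.LT, tokenizer.GT, tokenizer.LTE, tokenizer.GTE,
		tokenizer.EQ, tokenizer.EQEQ, tokenizer.UNEQ:
		return "comparison"
	case tokenizer.PLUS, tokenizer.MINUS, tokenizer.MUL, tokenizer.DIV:
		return "arithmetic"
	case tokenizer.IDENT:
		return "identifier"
	default:
		return "other"
	}
}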

type TokenType

type TokenType struct {
	Token   Token
	Literal string
}

func Tokenize

func Tokenize(raw_str string) []TokenType
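
Tokenize carries no doc comment here; judging from its signature, it appears to scan raw_str in full and return each token paired with its literal. A minimal sketch under that assumption (the import path is again hypothetical):

package main

import (
	"fmt"

	"example.com/tokenizer" // hypothetical import path
)

func main() {
	// Tokenize the whole input up front, then walk the resulting slice.
	for _, t := range tokenizer.Tokenize("(a => b) + c") {
		fmt.Printf("token=%d literal=%q\n", t.Token, t.Literal)
	}
}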
