read

package
v0.0.0-...-31ab3be
Published: Jul 18, 2019 License: MIT Imports: 5 Imported by: 0

Documentation

Overview

Package read provides the system's standard lexer and reader.

Index

Constants

const (
	UnmatchedState      = "unmatched lexing state"
	StringNotTerminated = "string has no closing quote"
)

Error messages

const (
	PrefixedNotPaired  = "end of file reached before completing %s"
	InvalidListSyntax  = "invalid list syntax"
	ListNotClosed      = "end of file reached with open list"
	UnmatchedListEnd   = "encountered ')' with no open list"
	VectorNotClosed    = "end of file reached with open vector"
	UnmatchedVectorEnd = "encountered ']' with no open vector"
	MapNotClosed       = "end of file reached with open map"
	UnmatchedMapEnd    = "encountered '}' with no open map"
)

Error messages

Variables

This section is empty.

Functions

func FromScanner

func FromScanner(lexer data.Sequence) data.Sequence

FromScanner returns a lazy Sequence of scanned data structures.

func FromString

func FromString(src data.String) data.Sequence

FromString converts the raw source into unexpanded data structures.

func Scan

func Scan(src data.String) data.Sequence

Scan creates a new lexer Sequence.

Types

type Token

type Token struct {
	Type  TokenType
	Value data.Value
}

Token is a lexer value.

func (*Token) String

func (t *Token) String() string

String converts this Value into a string.

type TokenType

type TokenType int

TokenType is an opaque type for lexer tokens.

const (
	Error TokenType = iota
	Identifier
	String
	Number
	ListStart
	ListEnd
	VectorStart
	VectorEnd
	MapStart
	MapEnd
	QuoteMarker
	SyntaxMarker
	UnquoteMarker
	SpliceMarker
	PatternMarker
	Whitespace
	Comment
)

Token Types
