lexer

package
v0.0.0-...-d03bbad
Published: Aug 7, 2018 License: MIT Imports: 4 Imported by: 1

Documentation

Overview

Package lexer implements lexical scanning of JavaScript source code into a sequence of tokens.

Constants

This section is empty.

Variables

var EOF = Tokval{Type: token.EOF, Value: utf16.S("EOF")}

EOF is the End of File token.

Functions

func Lex

func Lex(code utf16.Str) <-chan Tokval

Lex lexes the given JavaScript code (a UTF-16 string) and provides a stream of tokens as a result (the returned channel).

The caller should iterate over the returned channel until it is closed, which indicates EOF (or an error). Errors are reported as tokens and should be handled by checking the token type.

A goroutine is started to lex the given code; if you do not iterate over the returned channel, that goroutine will leak. You MUST drain the channel.
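
A minimal sketch of driving Lex and draining the channel. The import paths below (example.com/jsparser/...) and the error token name (token.Illegal) are assumptions; substitute the real module path and error token type used by this module.

package main

import (
	"fmt"

	// Hypothetical import paths; replace with the real module path.
	"example.com/jsparser/lexer"
	"example.com/jsparser/token"
	"example.com/jsparser/utf16"
)

func main() {
	// Lex starts a goroutine; ranging until the channel closes drains it,
	// so the goroutine does not leak.
	for tok := range lexer.Lex(utf16.S("var answer = 42;")) {
		// Errors arrive as tokens: check the token type.
		// token.Illegal is an assumed name for the error token type.
		if tok.Type == token.Illegal {
			fmt.Printf("lex error at %d:%d: %s\n", tok.Line, tok.Column, tok)
			return
		}
		fmt.Println(tok)
	}
}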

Types

type Tokval

type Tokval struct {
	Type   token.Type
	Value  utf16.Str
	Line   uint
	Column uint
}

func (Tokval) Equal

func (t Tokval) Equal(other Tokval) bool

Equal reports whether the token is the same as other.
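
For instance, a sketch of comparing received tokens against the EOF sentinel (this assumes Equal ignores line and column, with EqualPos presumably comparing positions as well; the lexer and utf16 import paths are as in the earlier sketch):

	for tok := range lexer.Lex(utf16.S("")) {
		if tok.Equal(lexer.EOF) {
			fmt.Println("reached end of input")
		}
	}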

func (Tokval) EqualPos

func (t Tokval) EqualPos(other Tokval) bool

func (Tokval) String

func (t Tokval) String() string
