Documentation ¶
Index ¶
- Constants
- func FprintErrs(w io.Writer, errs []*Error, workDir string)
- func IsDigit(r rune) bool
- func IsHexDigit(r rune) bool
- func IsIdentLetter(r rune) bool
- func IsLetter(r rune) bool
- func IsPkgName(s string) bool
- func IsWhite(r rune) bool
- func IsWhiteOrEndl(r rune) bool
- func KeywordSet(words ...string) map[string]struct{}
- func LogError(log Logger, e error) bool
- func Tokens(tokener Tokener) ([]*Token, []*Error)
- type Error
- type ErrorList
- func (lst *ErrorList) Add(e *Error)
- func (lst *ErrorList) AddAll(es []*Error)
- func (lst *ErrorList) BailOut()
- func (lst *ErrorList) CodeErrorf(p *Pos, c, f string, args ...interface{})
- func (lst *ErrorList) Errorf(p *Pos, f string, args ...interface{})
- func (lst *ErrorList) Errs() []*Error
- func (lst *ErrorList) InJail() bool
- func (lst *ErrorList) Jail()
- func (lst *ErrorList) Print(w io.Writer) error
- type Keyworder
- type LexFunc
- type Lexer
- func (x *Lexer) Buffered() string
- func (x *Lexer) CodeErrorf(c, f string, args ...interface{})
- func (x *Lexer) Discard()
- func (x *Lexer) Ended() bool
- func (x *Lexer) Errorf(f string, args ...interface{})
- func (x *Lexer) Errs() []*Error
- func (x *Lexer) MakeToken(t int) *Token
- func (x *Lexer) Next() (rune, error)
- func (x *Lexer) Rune() rune
- func (x *Lexer) See(r rune) bool
- func (x *Lexer) SkipWhite()
- func (x *Lexer) Token() *Token
- type Logger
- type Parser
- func (p *Parser) Accept(t int) bool
- func (p *Parser) BailOut()
- func (p *Parser) CodeErrorf(pos *Pos, c, f string, args ...interface{})
- func (p *Parser) CodeErrorfHere(c, f string, args ...interface{})
- func (p *Parser) Errorf(pos *Pos, f string, args ...interface{})
- func (p *Parser) ErrorfHere(f string, args ...interface{})
- func (p *Parser) Errs() []*Error
- func (p *Parser) Expect(t int) *Token
- func (p *Parser) ExpectLit(t int, lit string) *Token
- func (p *Parser) InError() bool
- func (p *Parser) Jail()
- func (p *Parser) Next() *Token
- func (p *Parser) See(t int) bool
- func (p *Parser) SeeLit(t int, lit string) bool
- func (p *Parser) Shift() *Token
- func (p *Parser) SkipErrStmt(sep int) bool
- func (p *Parser) Token() *Token
- func (p *Parser) TypeStr(t int) string
- type Pos
- type Recorder
- type Remover
- type Token
- type Tokener
- type Types
- type WhiteFunc
Constants ¶
const (
    EOF = -1 - iota
    Comment
    Illegal
)
Standard token types
const (
    Word = iota
    Punc
)
Token types for the example lexer.
Variables ¶
This section is empty.
Functions ¶
func FprintErrs ¶
FprintErrs prints a list of errors.
func IsHexDigit ¶
IsHexDigit returns true when the rune is in 0-9, a-f, or A-F.
func IsIdentLetter ¶
IsIdentLetter checks if rune r can start an identifier.
func IsWhite ¶
IsWhite is the default white-space function for a lexer. It returns true for space, \t and \r, and false for \n.
func IsWhiteOrEndl ¶
IsWhiteOrEndl is another IsWhite function that also returns true for \n.
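The rune classifiers above are simple predicates; a minimal sketch of what their documented behavior implies (the lowercase names here are illustrative stand-ins, not the package's exported functions):

```go
package main

import "fmt"

// isHexDigit mirrors the documented IsHexDigit behavior:
// true for runes in 0-9, a-f, or A-F.
func isHexDigit(r rune) bool {
	switch {
	case r >= '0' && r <= '9':
		return true
	case r >= 'a' && r <= 'f':
		return true
	case r >= 'A' && r <= 'F':
		return true
	}
	return false
}

// isWhite mirrors the default IsWhite: space, \t and \r, but not \n,
// so that line endings can be tokenized separately.
func isWhite(r rune) bool {
	return r == ' ' || r == '\t' || r == '\r'
}

// isWhiteOrEndl also treats \n as white space.
func isWhiteOrEndl(r rune) bool {
	return isWhite(r) || r == '\n'
}

func main() {
	fmt.Println(isHexDigit('F'), isWhite('\n'), isWhiteOrEndl('\n'))
}
```

Keeping \n out of the default white-space set lets a grammar treat line endings as statement separators.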
func KeywordSet ¶
KeywordSet creates a keyword set.
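Given its signature, KeywordSet is the idiomatic Go set-building helper; a sketch of what it plausibly does (the lowercase name is a stand-in):

```go
package main

import "fmt"

// keywordSet sketches the documented KeywordSet helper: it builds a
// set (a map with zero-size empty-struct values) from a word list.
func keywordSet(words ...string) map[string]struct{} {
	set := make(map[string]struct{}, len(words))
	for _, w := range words {
		set[w] = struct{}{}
	}
	return set
}

func main() {
	kw := keywordSet("if", "else", "for")
	_, isKeyword := kw["for"]
	fmt.Println(isKeyword)
}
```

A `map[string]struct{}` is the conventional Go set type: membership tests cost one map lookup and the values take no storage.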
Types ¶
type Error ¶
type Error struct {
    Pos  *Pos   // Pos can be nil for errors not related to any position.
    Err  error  // Err is the error message, human friendly.
    Code string // Code is the error code, machine friendly.
}
Error is a parsing error.
func CodeErrorf ¶
CodeErrorf creates a lex8.Error with an error code.
func SingleCodeErr ¶
SingleCodeErr returns an error slice containing a single error with the given error code.
func (*Error) ErrorRelFile ¶
ErrorRelFile returns the error with its file path expressed relative to the given workDir.
type ErrorList ¶
type ErrorList struct {
    Max int
    // contains filtered or unexported fields
}
ErrorList saves a list of errors.
func NewErrorList ¶
func NewErrorList() *ErrorList
NewErrorList creates a new error list with default (20) maximum lines of errors.
func (*ErrorList) CodeErrorf ¶
CodeErrorf appends a new error with an error code.
func (*ErrorList) InJail ¶
InJail reports whether a new error has been added since the list was created or since the last bail-out.
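The jail/bail-out pair acts as a sticky error flag for error recovery; a minimal sketch of these documented semantics, with illustrative lowercase names rather than the real lex8 types:

```go
package main

import "fmt"

// errorList sketches the documented ErrorList semantics: errors
// accumulate up to Max, and an "in jail" flag records that an error
// has been added since creation or since the last BailOut.
type errorList struct {
	Max    int
	errs   []error
	inJail bool
}

// Errorf appends a formatted error (up to Max) and enters jail.
func (lst *errorList) Errorf(f string, args ...interface{}) {
	lst.inJail = true
	if len(lst.errs) < lst.Max {
		lst.errs = append(lst.errs, fmt.Errorf(f, args...))
	}
}

func (lst *errorList) InJail() bool { return lst.inJail }

// Jail enters the error state without recording a message.
func (lst *errorList) Jail() { lst.inJail = true }

// BailOut clears the error state so parsing can resume.
func (lst *errorList) BailOut() { lst.inJail = false }

func main() {
	lst := &errorList{Max: 20}
	lst.Errorf("unexpected rune %q", 'x')
	fmt.Println(lst.InJail()) // an error was added
	lst.BailOut()
	fmt.Println(lst.InJail()) // back to normal parsing
}
```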
type Keyworder ¶
type Keyworder struct {
    Keywords map[string]struct{}
    Ident    int
    Keyword  int
    // contains filtered or unexported fields
}
Keyworder converts identifier tokens into keyword tokens.
func NewKeyworder ¶
NewKeyworder creates a new tokener that changes the type of an identifier token to the keyword type when its literal is in the keyword set.
type Lexer ¶
Lexer parses a file input stream into tokens.
func NewCommentLexer ¶
NewCommentLexer returns a lexer that parses only comments.
func NewWordLexer ¶
NewWordLexer returns an example lexer that parses a file into words and punctuations.
func (*Lexer) CodeErrorf ¶
CodeErrorf adds an error with an error code into the error list.
func (*Lexer) MakeToken ¶
MakeToken accepts the runes in the scanning buffer and returns them as a token of type t.
func (*Lexer) Next ¶
Next pushes the current rune into the scanning buffer, and returns the next rune.
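The Next/MakeToken pair describes a buffering discipline: runes accumulate in a scanning buffer until they are flushed as one token. A self-contained sketch of that discipline, with simplified illustrative types (the real Lexer reads a stream and Next also returns an error):

```go
package main

import "fmt"

// lexer sketches the documented scanning-buffer discipline.
type lexer struct {
	src []rune
	pos int
	buf []rune // runes accepted so far for the next token
}

const eof = rune(-1)

// Rune returns the current rune, or eof at the end of input.
func (x *lexer) Rune() rune {
	if x.pos >= len(x.src) {
		return eof
	}
	return x.src[x.pos]
}

// Next pushes the current rune into the scanning buffer and
// returns the next rune.
func (x *lexer) Next() rune {
	if r := x.Rune(); r != eof {
		x.buf = append(x.buf, r)
		x.pos++
	}
	return x.Rune()
}

// MakeToken flushes the buffered runes as one token's literal.
func (x *lexer) MakeToken() string {
	lit := string(x.buf)
	x.buf = x.buf[:0]
	return lit
}

func main() {
	x := &lexer{src: []rune("ab c")}
	for x.Rune() != eof && x.Rune() != ' ' {
		x.Next()
	}
	fmt.Println(x.MakeToken()) // the buffered word before the space
}
```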
type Parser ¶
type Parser struct {
// contains filtered or unexported fields
}
Parser provides common helper functions for parsing. It does NOT provide a working parser for any particular grammar.
func (*Parser) Accept ¶
Accept shifts the tokener by one token and returns true if the current token is of type t; otherwise it returns false and nothing happens.
func (*Parser) BailOut ¶
func (p *Parser) BailOut()
BailOut bails out the parser from an error state.
func (*Parser) CodeErrorf ¶
CodeErrorf adds a new parser error with an error code.
func (*Parser) CodeErrorfHere ¶
CodeErrorfHere adds a new parser error at the current token position.
func (*Parser) Errorf ¶
Errorf adds a new parser error to the parser's error list at a particular position.
func (*Parser) ErrorfHere ¶
ErrorfHere adds a new parser error at the current token position.
func (*Parser) Errs ¶
Errs returns the parsing error list if lexing produced no errors; if lexing produced errors, it returns the lexing error list instead.
func (*Parser) Expect ¶
Expect checks if the current token is type t. If it is, the token is accepted, the current token is shifted, and it returns the accepted token. If it is not, the call reports an error, enters the parser into error state, and returns nil. If the parser is already in error state, the call returns nil immediately, and nothing is checked.
func (*Parser) ExpectLit ¶
ExpectLit checks if the current token is type t and has literal lit. If it is, the token is accepted, the current token is shifted, and it returns the accepted token. If it is not, the call reports an error, enters the parser into error state, and returns nil. If the parser is already in error state, the call returns nil immediately, and nothing is checked.
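The contract described for Accept and Expect can be sketched as follows; the types and field names here are illustrative stand-ins, not the real lex8 implementation:

```go
package main

import "fmt"

type token struct {
	Type int
	Lit  string
}

// parser sketches the documented Accept/Expect behavior.
type parser struct {
	toks    []*token
	cur     int
	inError bool
}

func (p *parser) Token() *token { return p.toks[p.cur] }

func (p *parser) shift() *token {
	t := p.toks[p.cur]
	if p.cur < len(p.toks)-1 {
		p.cur++ // never shift past the trailing EOF token
	}
	return t
}

// Accept shifts and returns true only when the current token
// has type t; on mismatch nothing happens.
func (p *parser) Accept(t int) bool {
	if p.Token().Type != t {
		return false
	}
	p.shift()
	return true
}

// Expect is like Accept but enters the error state on mismatch,
// and is a no-op once the parser is already in the error state.
func (p *parser) Expect(t int) *token {
	if p.inError {
		return nil
	}
	if p.Token().Type != t {
		p.inError = true
		return nil
	}
	return p.shift()
}

const (
	tWord = iota
	tPunc
	tEOF
)

func main() {
	p := &parser{toks: []*token{
		{tWord, "hello"}, {tPunc, ";"}, {tEOF, ""},
	}}
	fmt.Println(p.Accept(tPunc))     // false: current token is a word
	fmt.Println(p.Expect(tWord).Lit) // the word is accepted and shifted
	p.Expect(tWord)                  // mismatch: enters error state
	fmt.Println(p.inError)
}
```

The early return when already in error state is what makes Expect safe to chain: after one failure, the rest of the production's Expect calls fall through without piling up spurious errors.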
func (*Parser) InError ¶
InError checks if the parser is in error state. A parser enters the error state when a parser error is added with Errorf() or ErrorfHere(). A parser leaves the error state by calling BailOut().
func (*Parser) Jail ¶
func (p *Parser) Jail()
Jail puts the parser in error state without adding an error message.
func (*Parser) SkipErrStmt ¶
SkipErrStmt skips tokens until it meets a token of type sep or the end of file (token EOF) and returns true, but only when the parser is in error state. If the parser is not in error state, it returns false and nothing is skipped.
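The skip-to-separator recovery described above can be sketched over a plain slice of token types (a simplification of the real token stream; the function name and shape are illustrative):

```go
package main

import "fmt"

const eof = -1 // sentinel type for the end-of-file token

// skipErrStmt sketches the documented recovery strategy: when in an
// error state, advance past tokens until a separator of type sep or
// EOF is reached; when not in an error state, do nothing.
func skipErrStmt(types []int, pos, sep int, inError bool) int {
	if !inError {
		return pos
	}
	for types[pos] != sep && types[pos] != eof {
		pos++
	}
	return pos
}

func main() {
	const semi = 1
	types := []int{0, 0, semi, 0, eof}
	fmt.Println(skipErrStmt(types, 0, semi, true))  // stops at the separator
	fmt.Println(skipErrStmt(types, 0, semi, false)) // nothing skipped
}
```

Resuming at the next statement separator keeps one syntax error from cascading into a spurious error on every following token.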
type Recorder ¶
type Recorder struct {
    Tokener
    // contains filtered or unexported fields
}
Recorder is a token filter that records all the tokens a tokener generates.
func NewRecorder ¶
NewRecorder creates a new recorder that filters the given tokener.
type Remover ¶
type Remover struct {
    Tokener
    // contains filtered or unexported fields
}
Remover removes a particular type of token from a token stream.
func NewCommentRemover ¶
NewCommentRemover creates a new remover that removes comment tokens.
func NewRemover ¶
NewRemover creates a new remover that removes tokens of type t.
type Token ¶
Token defines a token structure.
func LexComment ¶
LexComment lexes a C-style comment. It is not a complete LexFunc: as a precondition, it assumes that a "/" is already buffered in the lexer.
func LexRawString ¶
LexRawString parses a raw string token of type t, which is quoted in a pair of backquotes (`).
type Tokener ¶
type Tokener interface {
    // Token returns the next token.
    Token() *Token

    // Errs returns the error list from tokenizing.
    Errs() []*Error
}
Tokener is a token-emitting interface.
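Because Recorder, Remover, and Keyworder all embed a Tokener, filters compose by wrapping one tokener in another. A self-contained sketch of that composition pattern, using illustrative stand-in types rather than the real lex8 ones:

```go
package main

import "fmt"

type Token struct {
	Type int
	Lit  string
}

type Error struct{ msg string }

// Tokener is the token-emitting interface from the package.
type Tokener interface {
	Token() *Token
	Errs() []*Error
}

// sliceTokener replays a fixed token list; it keeps returning the
// last (EOF-like) token once the list is exhausted.
type sliceTokener struct {
	toks []*Token
	pos  int
}

func (s *sliceTokener) Token() *Token {
	t := s.toks[s.pos]
	if s.pos < len(s.toks)-1 {
		s.pos++
	}
	return t
}

func (s *sliceTokener) Errs() []*Error { return nil }

// remover wraps a Tokener and drops tokens of one type, in the
// spirit of NewRemover(t); Errs is inherited via embedding.
type remover struct {
	Tokener
	t int
}

func (r *remover) Token() *Token {
	for {
		tok := r.Tokener.Token()
		if tok.Type != r.t {
			return tok
		}
	}
}

const (
	word = iota
	comment
	eofTok
)

func main() {
	base := &sliceTokener{toks: []*Token{
		{word, "a"}, {comment, "// hi"}, {word, "b"}, {eofTok, ""},
	}}
	var toks Tokener = &remover{Tokener: base, t: comment}
	fmt.Println(toks.Token().Lit, toks.Token().Lit) // comments are skipped
}
```

Embedding the wrapped Tokener means a filter only overrides Token() and gets Errs() for free, which is why the chain stays cheap to extend.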