impl

package
v0.2.1
Published: May 21, 2019 License: MIT Imports: 6 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	Err error // Optional error that occurred during parsing
	// The most recently read token
	Tok struct {
		Kind  TokenKind
		Pos   Pos    // Zero-based line and column indexes of the token
		Value string // Optional value of the token
	}
	// contains filtered or unexported fields
}

func (*Lexer) Eof

func (lex *Lexer) Eof() bool

Eof reports whether the end of the input has been reached.

func (*Lexer) Next

func (lex *Lexer) Next()

func (*Lexer) Reset

func (lex *Lexer) Reset(buf []byte)

Reset prepares the lexer for a new deserialization by assigning it a buffer that contains the input JSON.
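
The following is a minimal usage sketch, not an example shipped with the package. It assumes the zero-value Lexer is usable once Reset has been called, that each call to Next reads one token into Tok, that Err is set when the input is malformed, and that a TkEof token marks the end of the input; the import path is hypothetical.

package main

import (
	"fmt"

	"example.com/jsonlexer/impl" // hypothetical import path for this package
)

func main() {
	var lex impl.Lexer

	// Point the lexer at the input JSON.
	lex.Reset([]byte(`{"id": 42, "ok": true}`))

	for {
		lex.Next() // assumed to read the next token into lex.Tok
		if lex.Err != nil {
			fmt.Printf("parse error at %s: %v\n", lex.Tok.Pos, lex.Err)
			return
		}
		if lex.Tok.Kind == impl.TkEof {
			return // end of input; lex.Eof() should also report true here
		}
		fmt.Printf("%s %q at %s\n", lex.Tok.Kind, lex.Tok.Value, lex.Tok.Pos)
	}
}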

type Pos

type Pos struct {
	// contains filtered or unexported fields
}

Pos represents a zero-based position of a token in the input.

func (Pos) String

func (pos Pos) String() string

type TokenKind

type TokenKind int
const (
	TkEof TokenKind = iota
	TkString
	TkNumber
	TkTrue
	TkFalse
	TkCrBrOpen
	TkCrBrClose
	TkSqBrOpen
	TkSqBrClose
	TkColon
	TkComma
	TkNull
)

func (TokenKind) MessageString

func (tk TokenKind) MessageString() string

MessageString gets a representation of the token kind based on whether it corresponds to a known character sequence or must correspond to an arbitrary value.

func (TokenKind) String

func (tk TokenKind) String() string
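
As an illustration of how MessageString and Pos might be used in diagnostics, the helper below is a hypothetical sketch (not part of the package) of an "expect" function that a parser built on this lexer could use; it assumes the same Next/Err semantics as the sketch above, and the import path is again hypothetical.

package parser

import (
	"fmt"

	"example.com/jsonlexer/impl" // hypothetical import path for this package
)

// expect returns an error when the current token is not of the wanted kind,
// using MessageString for a readable description of both kinds; otherwise it
// consumes the token and advances the lexer.
func expect(lex *impl.Lexer, want impl.TokenKind) error {
	if lex.Tok.Kind != want {
		return fmt.Errorf("%s: expected %s, found %s",
			lex.Tok.Pos, want.MessageString(), lex.Tok.Kind.MessageString())
	}
	lex.Next()
	return lex.Err
}

A call such as expect(lex, impl.TkColon) would then flag a missing colon between an object key and its value, prefixing the message with the token position.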
