README
JSON

This package is a JSON lexer (ECMA-404) written in Go. It follows the specification at http://json.org/. The lexer takes an io.Reader and converts it into tokens until EOF.
Installation
Run the following command:
go get github.com/tdewolff/parse/json
or add the following import and run the project with go get:
import "github.com/tdewolff/parse/json"
Parser
Usage
The following initializes a new Parser with io.Reader r:
p := json.NewParser(r)
To tokenize until EOF or an error, use:
for {
	gt, text := p.Next()
	switch gt {
	case json.ErrorGrammar:
		// error or EOF set in p.Err()
		return
	// ...
	}
}
All grammars:
ErrorGrammar GrammarType = iota // extra grammar when errors occur
WhitespaceGrammar // space \t \r \n
LiteralGrammar // null true false
NumberGrammar
StringGrammar
StartObjectGrammar // {
EndObjectGrammar // }
StartArrayGrammar // [
EndArrayGrammar // ]
Examples
package main

import (
	"fmt"
	"io"
	"os"

	"github.com/tdewolff/parse/json"
)

// Tokenize JSON from stdin.
func main() {
	p := json.NewParser(os.Stdin)
	for {
		gt, text := p.Next()
		switch gt {
		case json.ErrorGrammar:
			if p.Err() != io.EOF {
				fmt.Println("Error on line", p.Line(), ":", p.Err())
			}
			return
		case json.LiteralGrammar:
			fmt.Println("Literal", string(text))
		case json.NumberGrammar:
			fmt.Println("Number", string(text))
		// ...
		}
	}
}
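A second sketch, not part of the package's own examples: the lexes helper and the input literal below are illustrative. It consumes all tokens from a strings.Reader and treats stopping at io.EOF, rather than another error, as a clean lex.
package main

import (
	"fmt"
	"io"
	"strings"

	"github.com/tdewolff/parse/json"
)

// lexes reports whether the lexer consumes the whole input and stops only at io.EOF.
func lexes(s string) bool {
	p := json.NewParser(strings.NewReader(s))
	for {
		gt, _ := p.Next()
		if gt == json.ErrorGrammar {
			// io.EOF means the whole input was tokenized; anything else is a real error.
			return p.Err() == io.EOF
		}
	}
}

func main() {
	fmt.Println(lexes(`{"key": [1, 2.5, "value", true, null]}`))
}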
License
Released under the MIT license.
Documentation
Overview
Package json is a JSON parser following the specifications at http://json.org/.
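A minimal usage sketch, condensed from the README example above; the input literal and variable names are illustrative and not taken from the package's own examples.
package main

import (
	"fmt"
	"io"
	"strings"

	"github.com/tdewolff/parse/json"
)

func main() {
	p := json.NewParser(strings.NewReader(`{"key": "value"}`))
	for {
		gt, text := p.Next()
		if gt == json.ErrorGrammar {
			// p.Err() is io.EOF at the end of input, or a tokenization error otherwise.
			if p.Err() != io.EOF {
				fmt.Println("error:", p.Err())
			}
			return
		}
		fmt.Println(gt, string(text))
	}
}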
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type GrammarType
type GrammarType uint32
GrammarType determines the type of grammar.
const (
	ErrorGrammar GrammarType = iota // extra grammar when errors occur
	WhitespaceGrammar
	LiteralGrammar
	NumberGrammar
	StringGrammar
	StartObjectGrammar // {
	EndObjectGrammar   // }
	StartArrayGrammar  // [
	EndArrayGrammar    // ]
)
GrammarType values.
func (GrammarType) String
func (gt GrammarType) String() string
String returns the string representation of a GrammarType.
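A small sketch that prints the numeric value and name of each grammar constant through String(); the exact names returned are whatever this package version defines, so no output is shown here.
package main

import (
	"fmt"

	"github.com/tdewolff/parse/json"
)

func main() {
	// GrammarType is a uint32, so it converts directly; String() gives its name.
	for _, gt := range []json.GrammarType{
		json.ErrorGrammar,
		json.WhitespaceGrammar,
		json.LiteralGrammar,
		json.NumberGrammar,
		json.StringGrammar,
		json.StartObjectGrammar,
		json.EndObjectGrammar,
		json.StartArrayGrammar,
		json.EndArrayGrammar,
	} {
		fmt.Println(uint32(gt), gt.String())
	}
}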
type Parser
type Parser struct {
// contains filtered or unexported fields
}
Parser is the state for the lexer.
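A short sketch, assuming NewParser as used in the README above: a Parser can be built from any io.Reader. The input literals are illustrative.
package main

import (
	"bytes"
	"os"
	"strings"

	"github.com/tdewolff/parse/json"
)

func main() {
	// Any io.Reader works as input.
	_ = json.NewParser(os.Stdin)
	_ = json.NewParser(strings.NewReader(`[1, 2, 3]`))
	_ = json.NewParser(bytes.NewReader([]byte(`{"a": true}`)))
}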
func (*Parser) Err
func (p *Parser) Err() error
Err returns the error encountered during tokenization; this is often io.EOF, but other errors can be returned as well.
func (*Parser) Next
func (p *Parser) Next() (GrammarType, []byte)
Next returns the next Grammar. It returns ErrorGrammar when an error was encountered. Using Err() one can retrieve the error message.
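A sketch combining Next and Err; the counts map and input literal are illustrative. It tallies tokens per grammar type, stops at ErrorGrammar, and treats io.EOF as normal termination.
package main

import (
	"fmt"
	"io"
	"strings"

	"github.com/tdewolff/parse/json"
)

func main() {
	p := json.NewParser(strings.NewReader(`{"a": [1, 2], "b": "text"}`))

	// Count tokens per grammar type until ErrorGrammar is returned.
	counts := map[json.GrammarType]int{}
	for {
		gt, _ := p.Next()
		if gt == json.ErrorGrammar {
			if p.Err() != io.EOF {
				fmt.Println("error:", p.Err())
			}
			break
		}
		counts[gt]++
	}
	for gt, n := range counts {
		fmt.Println(gt, n)
	}
}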