Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func IsHexDigit ¶ added in v0.2.0
func IsValidEscapedSymbol ¶ added in v0.2.0
func StringDeepCopy ¶ added in v0.2.0
StringDeepCopy creates a copy of the given string with its own underlying byte array. Use this function to make a copy of a string returned by Token().
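A deep copy of this kind can be sketched as follows. This is a minimal illustration of the idea, not the package's actual implementation; the `stringDeepCopy` helper name is hypothetical:

```go
package main

import "fmt"

// stringDeepCopy copies s into a freshly allocated byte slice and
// converts it back to a string, so the result no longer aliases the
// lexer's internal buffer. (Illustrative sketch only.)
func stringDeepCopy(s string) string {
	b := make([]byte, len(s))
	copy(b, s)
	return string(b)
}

func main() {
	tok := "example"      // imagine this points into the lexer buffer
	cp := stringDeepCopy(tok)
	fmt.Println(cp == tok) // prints "true": same contents, independent storage
}
```

Since Go 1.18 the standard library's `strings.Clone` does the same thing.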
Types ¶
type JSONLexer ¶
type JSONLexer struct {
// contains filtered or unexported fields
}
JSONLexer is a JSON lexical analyzer with a streaming API, where the stream is a sequence of JSON tokens. JSONLexer does its own IO buffering, so prefer low-level readers if you want to minimize memory footprint.
JSONLexer uses a ring buffer for parsing tokens; every token must fit in the buffer, otherwise the buffer is grown automatically. The initial buffer size is 4096 bytes, but you can tweak it with SetBufSize() if you know that most tokens are going to be long.
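The grow-to-fit behaviour described above can be pictured with a small sketch. The `growIfNeeded` helper is hypothetical and only illustrates the doubling policy a buffer like this typically uses; the lexer's real code may differ:

```go
package main

import "fmt"

// growIfNeeded doubles the buffer size until a token of length n fits,
// preserving existing contents. (Hypothetical illustration of the
// "buffer will be automatically grown" behaviour.)
func growIfNeeded(buf []byte, n int) []byte {
	size := len(buf)
	for size < n {
		size *= 2
	}
	if size == len(buf) {
		return buf // token already fits
	}
	newBuf := make([]byte, size)
	copy(newBuf, buf)
	return newBuf
}

func main() {
	buf := make([]byte, 4096) // documented initial size
	buf = growIfNeeded(buf, 10000)
	fmt.Println(len(buf)) // prints "16384"
}
```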
JSONLexer uses unsafe pointers into the underlying buffer to minimize allocations; see Token() for the provided guarantees.
func NewJSONLexer ¶
NewJSONLexer creates a new JSONLexer with the given reader.
func (*JSONLexer) SetBufSize ¶
SetBufSize creates a new buffer of the given size. It MUST be called before parsing starts.
func (*JSONLexer) SetSkipDelims ¶
SetSkipDelims tells JSONLexer to skip delimiters and return only keys and values. This can be useful if you only want to match the input against some specific grammar and have no intention of doing full syntax analysis.
func (*JSONLexer) Token ¶
Token returns the next JSON token. All strings returned by Token are guaranteed to be valid only until the next Token call; to keep one longer you MUST make a deep copy.
func (*JSONLexer) TokenFast ¶ added in v0.2.0
func (l *JSONLexer) TokenFast() (TokenGeneric, error)
TokenFast is a more efficient version of Token(). All strings returned by TokenFast are guaranteed to be valid only until the next Token call; to keep one longer you MUST make a deep copy.
type TokenGeneric ¶ added in v0.2.0
type TokenGeneric struct {
// contains filtered or unexported fields
}
func (*TokenGeneric) Bool ¶ added in v0.2.0
func (t *TokenGeneric) Bool() bool
func (*TokenGeneric) Delim ¶ added in v0.2.0
func (t *TokenGeneric) Delim() byte
func (*TokenGeneric) IsNull ¶ added in v0.2.0
func (t *TokenGeneric) IsNull() bool
func (*TokenGeneric) Number ¶ added in v0.2.0
func (t *TokenGeneric) Number() float64
func (*TokenGeneric) String ¶ added in v0.2.0
func (t *TokenGeneric) String() string
String returns a string that points into the internal lexer buffer and is guaranteed to be valid only until the next Token call; to keep it longer you MUST make a deep copy.
func (*TokenGeneric) StringCopy ¶ added in v0.2.0
func (t *TokenGeneric) StringCopy() string
StringCopy returns a deep copy of the string.
func (*TokenGeneric) Type ¶ added in v0.2.0
func (t *TokenGeneric) Type() TokenType