Documentation ¶
Overview ¶
Package parser is a simple helper package for parsing strings, byte slices and io.Readers.
Index ¶
- Variables
- type Parser
- func (p *Parser) Accept(types ...TokenType) bool
- func (p *Parser) AcceptRun(types ...TokenType) TokenType
- func (p *Parser) AcceptToken(tokens ...Token) bool
- func (p *Parser) Done() (Phrase, PhraseFunc)
- func (p *Parser) Error() (Phrase, PhraseFunc)
- func (p *Parser) Except(types ...TokenType) bool
- func (p *Parser) ExceptRun(types ...TokenType) TokenType
- func (p *Parser) Get() []Token
- func (p *Parser) GetPhrase() (Phrase, error)
- func (p *Parser) GetToken() (Token, error)
- func (p *Parser) Len() int
- func (p *Parser) Peek() Token
- func (p *Parser) PhraserState(pf PhraseFunc)
- type Phrase
- type PhraseFunc
- type PhraseType
- type Token
- type TokenFunc
- type TokenType
- type Tokeniser
- func (t *Tokeniser) Accept(chars string) bool
- func (t *Tokeniser) AcceptRun(chars string) rune
- func (t *Tokeniser) Done() (Token, TokenFunc)
- func (t *Tokeniser) Error() (Token, TokenFunc)
- func (t *Tokeniser) Except(chars string) bool
- func (t *Tokeniser) ExceptRun(chars string) rune
- func (t *Tokeniser) Get() string
- func (t *Tokeniser) GetToken() (Token, error)
- func (t *Tokeniser) Len() int
- func (t *Tokeniser) Peek() rune
- func (t *Tokeniser) TokeniserState(tf TokenFunc)
Examples ¶
- NewByteTokeniser
- NewReaderTokeniser
- NewStringTokeniser
Constants ¶
This section is empty.
Variables ¶
var (
	ErrNoState      = errors.New("no state")
	ErrUnknownError = errors.New("unknown error")
)
Errors
Functions ¶
This section is empty.
Types ¶
type Parser ¶
type Parser struct {
	Tokeniser
	// contains filtered or unexported fields
}
Parser is a type used to get tokens or phrases (collections of tokens) from an input.
func (*Parser) Accept ¶
Accept will accept a token with one of the given types, returning true if one is read and false otherwise.
func (*Parser) AcceptRun ¶
AcceptRun will keep Accepting tokens as long as they match one of the given types.
It will return the type of the token that made it stop.
func (*Parser) AcceptToken ¶
AcceptToken will accept a token matching one of the ones provided exactly, returning true if one is read and false otherwise.
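A minimal sketch of the three accept methods inside a phrase function; the token and phrase types are hypothetical, and Token is assumed to carry Type and Data fields, mirroring Phrase.

package sketch

import "vimagination.zapto.org/parser"

// Hypothetical token and phrase types defined by the code using this package.
const (
	tokenWord parser.TokenType = iota
	tokenSpace
)

const phraseWords parser.PhraseType = 0

func phraseFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	if p.Peek().Type < 0 {
		return p.Done() // a reserved (negative) type: the tokeniser has finished
	}

	p.Accept(tokenSpace)               // at most one token of type tokenSpace
	p.AcceptRun(tokenWord, tokenSpace) // then any number of word/space tokens

	// AcceptToken matches whole tokens rather than just types; Token is
	// assumed to carry Type and Data fields, mirroring Phrase.
	p.AcceptToken(parser.Token{Type: tokenWord, Data: "end"})

	return parser.Phrase{Type: phraseWords, Data: p.Get()}, phraseFn
}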
func (*Parser) Done ¶
func (p *Parser) Done() (Phrase, PhraseFunc)
Done is a PhraseFunc that is used to indicate that there are no more phrases to parse.
func (*Parser) Error ¶
func (p *Parser) Error() (Phrase, PhraseFunc)
Error represents an error state for the phraser.
The error value should be set in Parser.Err and then this func should be called.
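A sketch of the intended flow inside a worker func, assuming Token carries a Type field; the token type, phrase type and error value are illustrative.

package sketch

import (
	"errors"

	"vimagination.zapto.org/parser"
)

// Illustrative token and phrase types, and an illustrative error value.
const tokenWord parser.TokenType = 0

const phraseWord parser.PhraseType = 0

var errUnexpected = errors.New("unexpected token")

// phraseWordFn expects the next token to be a word; anything else is treated
// as an error, and a reserved (negative) token type is taken to mean that the
// input is exhausted.
func phraseWordFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	if p.Peek().Type < 0 {
		return p.Done() // no more tokens, so no more phrases
	}

	if !p.Accept(tokenWord) {
		// Record the problem in Parser.Err, then switch to the error
		// state, as described above.
		p.Err = errUnexpected
		return p.Error()
	}

	return parser.Phrase{Type: phraseWord, Data: p.Get()}, phraseWordFn
}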
func (*Parser) Except ¶
Except will Accept a token that is not one of the types given. Returns true if it Accepted a token.
func (*Parser) ExceptRun ¶
ExceptRun will keep Accepting tokens as long as they do not match one of the given types.
It will return the type of the token that made it stop.
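A short sketch using both methods to gather everything up to a terminator, with hypothetical token and phrase types.

package sketch

import "vimagination.zapto.org/parser"

// Hypothetical token and phrase types for illustration.
const (
	tokenTerm parser.TokenType = iota
	tokenComment
)

const phraseBody parser.PhraseType = 0

func phraseBodyFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	if p.Peek().Type < 0 {
		return p.Done() // a reserved (negative) type: nothing left to read
	}

	p.Except(tokenComment) // one token of any type other than tokenComment
	p.ExceptRun(tokenTerm) // then keep going until a terminator stops the run

	return parser.Phrase{Type: phraseBody, Data: p.Get()}, phraseBodyFn
}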
func (*Parser) GetPhrase ¶
GetPhrase runs the state machine and retrieves a single Phrase and possibly an error.
func (*Parser) GetToken ¶
GetToken runs the state machine and retrieves a single Token and possibly an error.
If a Token has already been peeked, that token will be returned without running the state machine.
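A sketch of a driving loop built on GetPhrase (GetToken can be drained in the same way), assuming the Parser already has its token and phrase states set:

package sketch

import "vimagination.zapto.org/parser"

// parseAll drains a Parser that is assumed to already have its token and
// phrase states set (see PhraserState and Tokeniser.TokeniserState).
func parseAll(p *parser.Parser) ([]parser.Phrase, error) {
	var phrases []parser.Phrase

	for {
		ph, err := p.GetPhrase()
		if err != nil {
			return nil, err
		}

		if ph.Type == parser.PhraseDone {
			return phrases, nil
		}

		phrases = append(phrases, ph)
	}
}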
func (*Parser) PhraserState ¶
func (p *Parser) PhraserState(pf PhraseFunc)
PhraserState allows the internal state of the Phraser to be set.
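No Parser constructor appears in this documentation, so the sketch below assumes a Parser can be built by filling in the embedded, exported Tokeniser directly; the worker funcs and types are illustrative.

package main

import (
	"fmt"

	"vimagination.zapto.org/parser"
)

// Illustrative types; real code would define meaningful sets of these.
const tokenWord parser.TokenType = 0

const phraseWords parser.PhraseType = 0

const letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

// tokenWordFn emits one token per run of letters and reports Done at the end
// of the input.
func tokenWordFn(t *parser.Tokeniser) (parser.Token, parser.TokenFunc) {
	t.ExceptRun(letters) // skip anything that is not a letter
	t.Get()              // and discard it
	if !t.Accept(letters) {
		return t.Done()
	}
	t.AcceptRun(letters)
	return parser.Token{Type: tokenWord, Data: t.Get()}, tokenWordFn
}

// phraseWordsFn collects every word token into one phrase, then hands over to
// Done (assumed idiom for ending the phrase stream).
func phraseWordsFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	p.AcceptRun(tokenWord)
	return parser.Phrase{Type: phraseWords, Data: p.Get()}, (*parser.Parser).Done
}

func main() {
	// Assumed construction: the embedded, exported Tokeniser is filled in
	// directly, as no Parser constructor appears in this documentation.
	p := parser.Parser{Tokeniser: parser.NewStringTokeniser("Hello, World!")}

	p.TokeniserState(tokenWordFn) // initial state for the tokeniser
	p.PhraserState(phraseWordsFn) // initial state for the phraser

	ph, err := p.GetPhrase()
	fmt.Println(ph.Data, err)
}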
type Phrase ¶
type Phrase struct { Type PhraseType Data []Token }
Phrase represents a collection of tokens that have meaning together.
type PhraseFunc ¶
type PhraseFunc func(*Parser) (Phrase, PhraseFunc)
PhraseFunc is the type that the worker funcs implement in order to be used by the Phraser.
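A sketch of the general shape of these worker funcs: each one reads some tokens, returns a Phrase, and names the PhraseFunc that should handle the next call. The token and phrase types are hypothetical, and Token is assumed to carry a Type field.

package sketch

import "vimagination.zapto.org/parser"

const (
	tokenKey parser.TokenType = iota
	tokenValue
)

const (
	phraseKey parser.PhraseType = iota
	phraseValue
)

// phraseKeyFn reads a key and hands control to phraseValueFn for the next call.
func phraseKeyFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	if p.Peek().Type < 0 {
		return p.Done() // a reserved (negative) type: nothing left to read
	}

	p.Accept(tokenKey)
	return parser.Phrase{Type: phraseKey, Data: p.Get()}, phraseValueFn
}

// phraseValueFn reads a value and hands control back to phraseKeyFn.
func phraseValueFn(p *parser.Parser) (parser.Phrase, parser.PhraseFunc) {
	p.Accept(tokenValue)
	return parser.Phrase{Type: phraseValue, Data: p.Get()}, phraseKeyFn
}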
type PhraseType ¶
type PhraseType int
PhraseType represents the type of phrase being read.
Negative values are reserved for this package.
const (
	PhraseDone PhraseType = -1 - iota
	PhraseError
)
Constants PhraseError (-2) and PhraseDone (-1)
type TokenFunc ¶
TokenFunc is the type that the worker funcs implement in order to be used by the tokeniser.
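The definition is not shown above, but by analogy with PhraseFunc a TokenFunc is assumed to receive the Tokeniser and return the next Token along with the next state. A sketch of a simple comma-separated-field lexer, assuming Token carries Type and Data fields and that Peek returns -1 at the end of the input:

package sketch

import "vimagination.zapto.org/parser"

// Illustrative token type.
const tokenField parser.TokenType = 0

// fieldToken reads one comma-separated field per call and reports Done once
// the input is exhausted.
func fieldToken(t *parser.Tokeniser) (parser.Token, parser.TokenFunc) {
	if t.Peek() == -1 { // -1 is assumed to mean end of input
		return t.Done()
	}

	t.ExceptRun(",")                                     // gather the field
	tok := parser.Token{Type: tokenField, Data: t.Get()} // assumed Token fields
	t.Accept(",")                                        // consume the separator
	t.Get()                                              // and discard it

	return tok, fieldToken
}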
type TokenType ¶
type TokenType int
TokenType represents the type of token being read.
Negative values are reserved for this package.
type Tokeniser ¶
type Tokeniser struct {
	Err error
	// contains filtered or unexported fields
}
Tokeniser is a state machine used to generate tokens from an input.
func NewByteTokeniser ¶
NewByteTokeniser returns a Tokeniser which uses a byte slice.
Example ¶
package main

import (
	"fmt"

	"vimagination.zapto.org/parser"
)

func main() {
	p := parser.NewByteTokeniser([]byte("Hello, World!"))
	alphaNum := "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

	p.AcceptRun(alphaNum)
	word := p.Get()
	fmt.Println("got word:", word)

	p.ExceptRun(alphaNum)
	p.Get()

	p.AcceptRun(alphaNum)
	word = p.Get()
	fmt.Println("got word:", word)
}
Output:

got word: Hello
got word: World
func NewReaderTokeniser ¶
NewReaderTokeniser returns a Tokeniser which uses an io.Reader.
Example ¶
package main

import (
	"fmt"
	"strings"

	"vimagination.zapto.org/parser"
)

func main() {
	p := parser.NewReaderTokeniser(strings.NewReader("Hello, World!"))
	alphaNum := "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

	p.AcceptRun(alphaNum)
	word := p.Get()
	fmt.Println("got word:", word)

	p.ExceptRun(alphaNum)
	p.Get()

	p.AcceptRun(alphaNum)
	word = p.Get()
	fmt.Println("got word:", word)
}
Output:

got word: Hello
got word: World
func NewStringTokeniser ¶
NewStringTokeniser returns a Tokeniser which uses a string.
Example ¶
package main

import (
	"fmt"

	"vimagination.zapto.org/parser"
)

func main() {
	p := parser.NewStringTokeniser("Hello, World!")
	alphaNum := "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

	p.AcceptRun(alphaNum)
	word := p.Get()
	fmt.Println("got word:", word)

	p.ExceptRun(alphaNum)
	p.Get()

	p.AcceptRun(alphaNum)
	word = p.Get()
	fmt.Println("got word:", word)
}
Output:

got word: Hello
got word: World
func (*Tokeniser) Accept ¶
Accept returns true if the next character to be read is contained within the given string.
Upon true, it advances the read position, otherwise the position remains the same.
func (*Tokeniser) AcceptRun ¶
AcceptRun reads from the string as long as the read character is in the given string.
Returns the rune that stopped the run.
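A small sketch combining the two calls to read an optionally signed run of digits; the character sets are illustrative.

package sketch

import "vimagination.zapto.org/parser"

// signedDigits consumes an optional sign and then a run of digits, returning
// what was read.
func signedDigits(t *parser.Tokeniser) string {
	t.Accept("+-")            // at most one character from the set, if it is next
	t.AcceptRun("0123456789") // then any number of digits
	return t.Get()
}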
func (*Tokeniser) Done ¶
Done is a TokenFunc that is used to indicate that there are no more tokens to parse.
func (*Tokeniser) Error ¶
Error represents an error state for the tokeniser.
The error value should be set in Tokeniser.Err and then this func should be called.
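A sketch of that flow inside a worker func; the token type and error value are illustrative, and Peek returning -1 is assumed to mean the end of the input.

package sketch

import (
	"errors"

	"vimagination.zapto.org/parser"
)

const tokenDigits parser.TokenType = 0

// errExpectedDigit is an illustrative error value.
var errExpectedDigit = errors.New("expected digit")

// digitsToken requires at least one digit; otherwise it records the problem in
// Tokeniser.Err and switches to the error state, as described above.
func digitsToken(t *parser.Tokeniser) (parser.Token, parser.TokenFunc) {
	if t.Peek() == -1 { // -1 is assumed to mean end of input
		return t.Done()
	}

	if !t.Accept("0123456789") {
		t.Err = errExpectedDigit
		return t.Error()
	}

	t.AcceptRun("0123456789")
	return parser.Token{Type: tokenDigits, Data: t.Get()}, digitsToken
}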
func (*Tokeniser) Except ¶
Except returns true if the next character to be read is not contained within the given string. Upon true, it advances the read position, otherwise the position remains the same.
func (*Tokeniser) ExceptRun ¶
ExceptRun reads from the string as long as the read character is not in the given string.
Returns the rune that stopped the run.
func (*Tokeniser) Get ¶
Get returns a string of everything that has been read so far and resets the string for the next round of parsing.
func (*Tokeniser) GetToken ¶
GetToken runs the state machine and retrieves a single token and possibly an error.
func (*Tokeniser) TokeniserState ¶
TokeniserState allows the internal state of the Tokeniser to be set.
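A sketch tying the pieces together: install an initial worker func with TokeniserState, then pull tokens with GetToken until an error or a reserved (negative) token type appears. The worker func, token type and Token fields are illustrative assumptions.

package main

import (
	"fmt"

	"vimagination.zapto.org/parser"
)

const tokenWord parser.TokenType = 0

const letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

// wordToken emits one token per run of letters and reports Done at the end of
// the input.
func wordToken(t *parser.Tokeniser) (parser.Token, parser.TokenFunc) {
	t.ExceptRun(letters) // skip anything that is not a letter
	t.Get()              // and throw it away
	if !t.Accept(letters) {
		return t.Done()
	}
	t.AcceptRun(letters)
	return parser.Token{Type: tokenWord, Data: t.Get()}, wordToken
}

func main() {
	t := parser.NewStringTokeniser("Hello, World!")

	// Install the initial worker func, then read tokens until an error or
	// a reserved (negative) token type is returned.
	t.TokeniserState(wordToken)

	for {
		tok, err := t.GetToken()
		if err != nil || tok.Type < 0 {
			break
		}
		fmt.Println("token:", tok.Data)
	}
}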