Documentation ¶
Overview ¶
Package shlex provides simple lexical analysis of command lines, similar to the Unix shell.
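To illustrate what shell-like lexical analysis means, here is a minimal standalone sketch (not the package's implementation, and much simpler): whitespace separates words, and double quotes group characters, including spaces, into a single word.

```go
package main

import "fmt"

// splitShellLike is an illustrative sketch of shell-style word
// splitting: runs of spaces and tabs separate words, and a pair of
// double quotes keeps everything between them in one word. It handles
// far less than a real shell lexer (no single quotes, no escapes).
func splitShellLike(s string) []string {
	var words []string
	var cur []rune
	inQuote := false
	flush := func() {
		if len(cur) > 0 {
			words = append(words, string(cur))
			cur = cur[:0]
		}
	}
	for _, r := range s {
		switch {
		case r == '"':
			inQuote = !inQuote
		case (r == ' ' || r == '\t') && !inQuote:
			flush()
		default:
			cur = append(cur, r)
		}
	}
	flush()
	return words
}

func main() {
	// The quoted span stays a single word despite the space inside it.
	fmt.Println(splitShellLike(`cp "my file.txt" backup/`))
}
```

The package's Lexer type below provides this kind of splitting over an io.Reader, with proper quote, escape, and whitespace handling.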
Index ¶
Constants ¶
This section is empty.
Variables ¶
var (
	ErrNoClosing = errors.New("no closing quotation")
	ErrNoEscaped = errors.New("no escaped character")
)
Functions ¶
Types ¶
type DefaultTokenizer ¶
type DefaultTokenizer struct{}
DefaultTokenizer implements a simple tokenizer that follows Unix shell rules.
func (*DefaultTokenizer) IsEscape ¶
func (t *DefaultTokenizer) IsEscape(r rune) bool
func (*DefaultTokenizer) IsEscapedQuote ¶
func (t *DefaultTokenizer) IsEscapedQuote(r rune) bool
func (*DefaultTokenizer) IsQuote ¶
func (t *DefaultTokenizer) IsQuote(r rune) bool
func (*DefaultTokenizer) IsWhitespace ¶
func (t *DefaultTokenizer) IsWhitespace(r rune) bool
func (*DefaultTokenizer) IsWord ¶
func (t *DefaultTokenizer) IsWord(r rune) bool
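Each predicate above classifies a single rune. The following standalone sketch shows the kind of classification such methods perform; the exact character sets are assumptions based on common POSIX-shell conventions, not the package's actual tables.

```go
package main

import "fmt"

// Assumed rune classes, loosely following POSIX shell conventions.
// The real DefaultTokenizer may classify runes differently.
func isWhitespace(r rune) bool { return r == ' ' || r == '\t' || r == '\r' || r == '\n' }
func isQuote(r rune) bool      { return r == '\'' || r == '"' }
func isEscape(r rune) bool     { return r == '\\' }

// A word rune is anything not claimed by the other classes.
func isWord(r rune) bool { return !isWhitespace(r) && !isQuote(r) && !isEscape(r) }

func main() {
	for _, r := range []rune{'a', ' ', '"', '\\'} {
		fmt.Printf("%q word=%v quote=%v ws=%v escape=%v\n",
			r, isWord(r), isQuote(r), isWhitespace(r), isEscape(r))
	}
}
```

Defining IsWord as the complement of the other classes keeps the predicates mutually exclusive, which is what a lexer's state machine relies on when deciding how to consume the next rune.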
type Lexer ¶
type Lexer struct {
// contains filtered or unexported fields
}
Lexer represents a lexical analyzer.
func NewLexer ¶
NewLexer creates a new Lexer reading from an io.Reader. The returned Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
func NewLexerString ¶
NewLexerString creates a new Lexer reading from a string. The returned Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
func (*Lexer) SetTokenizer ¶
SetTokenizer sets a Tokenizer.