Documentation
Overview
Package shlex implements a simple lexer that splits input into tokens using shell-style rules for quoting and commenting.
The basic use case uses the default ASCII lexer to split a string into sub-strings:
shlex.Split("one \"two three\" four") -> []string{"one", "two three", "four"}
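As a fuller sketch, here is a complete program using Split. It assumes the conventional import path github.com/google/shlex and that Split also returns an error, neither of which is shown above:

package main

import (
	"fmt"

	"github.com/google/shlex"
)

func main() {
	// Split applies shell-style quoting rules, so "two three" stays one word.
	words, err := shlex.Split("one \"two three\" four")
	if err != nil {
		fmt.Println("parse error:", err)
		return
	}
	fmt.Println(words) // [one two three four]
}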
To process a stream of strings:
l := NewLexer(os.Stdin)
for {
	token, err := l.Next()
	if err != nil {
		break
	}
	// process token
}
To access the raw token stream (which includes tokens for comments):
t := NewTokenizer(os.Stdin)
for {
	token, err := t.Next()
	if err != nil {
		break
	}
	// process token
}
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
Types
type Lexer
type Lexer Tokenizer
Lexer turns an input stream into a sequence of tokens. Whitespace and comments are skipped.
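A runnable sketch of draining a Lexer, assuming the import path github.com/google/shlex and that Next signals end of input with io.EOF (neither is stated in this extract):

package main

import (
	"fmt"
	"io"
	"log"
	"strings"

	"github.com/google/shlex"
)

func main() {
	// The Lexer yields only words; whitespace and the trailing comment are skipped.
	lexer := shlex.NewLexer(strings.NewReader(`cp -r "src dir" dst # copy`))
	for {
		token, err := lexer.Next()
		if err == io.EOF {
			break // input exhausted
		}
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(token) // prints: cp, -r, src dir, dst
	}
}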
type Token
type Token struct {
// contains filtered or unexported fields
}
Token is a (type, value) pair representing a lexical token.
type TokenType
type TokenType int
TokenType is a top-level token classification: word, space, comment, or unknown.
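The constants enumerating these classes are not listed in this extract (the Constants section above is empty), so the declaration below is a purely hypothetical sketch of what such a classification could look like; the constant names are invented for illustration:

type TokenType int

// Hypothetical classes mirroring the description above; not confirmed by this documentation.
const (
	UnknownToken TokenType = iota
	WordToken
	SpaceToken
	CommentToken
)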
type Tokenizer
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer turns an input stream into a sequence of typed tokens.
func NewTokenizer
func NewTokenizer(r io.Reader) *Tokenizer
NewTokenizer creates a new tokenizer from an input stream.
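A runnable sketch of consuming the raw token stream, again assuming the import path github.com/google/shlex and io.EOF as the end-of-input signal. Token's fields are unexported here, so this sketch only counts the tokens it sees:

package main

import (
	"fmt"
	"io"
	"log"
	"strings"

	"github.com/google/shlex"
)

func main() {
	// Unlike the Lexer, the Tokenizer also emits space and comment tokens.
	t := shlex.NewTokenizer(strings.NewReader("make all # build everything"))
	count := 0
	for {
		if _, err := t.Next(); err != nil {
			if err == io.EOF {
				break // input exhausted
			}
			log.Fatal(err)
		}
		count++
	}
	fmt.Println("raw tokens seen:", count)
}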