Documentation ¶
Overview ¶
Package tokenizer converts LaTeX text input into a stream of tokens.
The package can perform macro expansion and has built-in knowledge about the macros and environments defined by different LaTeX classes and packages.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type MissingEndError ¶
type MissingEndError string
MissingEndError is used to indicate unterminated maths and tikz environments.
func (MissingEndError) Error ¶
func (err MissingEndError) Error() string
type Token ¶
type Token struct {
	// Type describes which kind of token this is.
	Type TokenType

	// For TokenMacro, this is the name of the macro, including the
	// leading backslash. For most other token types, this is the
	// textual content of the token.
	Name string

	// For tokens of type TokenMacro, this field specifies the values
	// of the macro arguments. Unused for all other token types.
	Args []*Arg
}
Token contains information about a single syntactic unit in the TeX source.
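As a sketch, the macro call `\frac{1}{2}` might be represented by a single Token as below. The types are local mirrors of the package's declarations; in particular, the Arg field name `Tokens` and the TokenType constants are assumptions not shown in the documentation above.

```go
package main

import "fmt"

// Local mirrors of the package's types, for illustration only.
type TokenType int

const (
	// Assumed token kinds; the package's actual constants may differ.
	TokenMacro TokenType = iota
	TokenWord
)

// Arg holds the tokenized contents of one macro argument.
// The field name "Tokens" is an assumption.
type Arg struct {
	Tokens []*Token
}

type Token struct {
	Type TokenType
	Name string
	Args []*Arg
}

func main() {
	// How a Token might look after reading `\frac{1}{2}`.
	tok := &Token{
		Type: TokenMacro,
		Name: `\frac`,
		Args: []*Arg{
			{Tokens: []*Token{{Type: TokenWord, Name: "1"}}},
			{Tokens: []*Token{{Type: TokenWord, Name: "2"}}},
		},
	}
	fmt.Println(tok.Name, len(tok.Args))
}
```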
type TokenList ¶
type TokenList []*Token
TokenList describes tokenized data in the argument of a macro call.
func (TokenList) FormatMaths ¶
func (toks TokenList) FormatMaths() string
FormatMaths converts a TokenList to a string, assuming maths mode. Redundant spaces are omitted.
func (TokenList) FormatText ¶
func (toks TokenList) FormatText() string
FormatText converts a TokenList to a string, assuming text mode.
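The following self-contained sketch shows the general shape of such a formatting method: it simply joins token names, whereas the package's real methods additionally handle macros, arguments, and spacing. The Token and TokenList declarations are simplified local stand-ins.

```go
package main

import (
	"fmt"
	"strings"
)

// Simplified stand-ins for the package's types.
type Token struct {
	Name string
}

type TokenList []*Token

// FormatText here is a toy version of the package's method:
// it concatenates token names. The real method also renders
// macro calls and normalises spacing.
func (toks TokenList) FormatText() string {
	parts := make([]string, len(toks))
	for i, tok := range toks {
		parts[i] = tok.Name
	}
	return strings.Join(parts, "")
}

func main() {
	toks := TokenList{{Name: "a"}, {Name: "+"}, {Name: "b"}}
	fmt.Println(toks.FormatText())
}
```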
type Tokenizer ¶
A Tokenizer can be used to split a LaTeX file into syntactic units. User-defined macros are expanded in the process.
func NewTokenizer ¶
func NewTokenizer() *Tokenizer
NewTokenizer creates and initialises a new Tokenizer.
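To illustrate the kind of splitting a Tokenizer performs, here is a toy, self-contained sketch that is not the package's API: a backslash starts a macro name, and other non-space characters become single-character tokens. The real Tokenizer additionally expands user-defined macros and tracks environments.

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// tokenize is a toy sketch of LaTeX token splitting: a backslash
// followed by letters forms a macro token (e.g. `\alpha`), spaces
// are skipped, and everything else is a one-character token.
func tokenize(src string) []string {
	var toks []string
	runes := []rune(src)
	for i := 0; i < len(runes); {
		switch {
		case runes[i] == '\\':
			// Consume the backslash and the letters of the macro name.
			j := i + 1
			for j < len(runes) && unicode.IsLetter(runes[j]) {
				j++
			}
			toks = append(toks, string(runes[i:j]))
			i = j
		case unicode.IsSpace(runes[i]):
			i++
		default:
			toks = append(toks, string(runes[i]))
			i++
		}
	}
	return toks
}

func main() {
	fmt.Println(strings.Join(tokenize(`\alpha + x`), " "))
}
```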