parser

package
v0.0.0-...-928eb7d
Published: Mar 7, 2020 | License: MIT | Imports: 3 | Imported by: 0

Documentation

Overview

Package parser finds the pages in a slice of tokens from tokenizer.

A normal page matches /t+(co+)*c/, although technically the last clear is not part of the page. There are many corner cases, but those are documented only in the tests.

This package expects 'a' to be tokenized as 't', in contrast to what the tokenizer package produces by default.
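
For illustration only, here is a small standalone sketch (not part of this package) that checks letter-encoded token sequences against the page pattern above using Go's regexp package; the single-letter encoding is assumed purely for the demonstration.

package main

import (
	"fmt"
	"regexp"
)

// pagePattern mirrors the page shape described above, with each token
// written as a single letter: one or more 't' tokens, zero or more groups
// of a clear ('c') followed by one or more 'o' tokens, and a final clear
// (which, per the note above, is technically not part of the page).
var pagePattern = regexp.MustCompile(`^t+(co+)*c$`)

func main() {
	for _, s := range []string{"tc", "ttc", "tcooc", "tcoococ", "tt", "coc"} {
		fmt.Printf("%q page-shaped: %v\n", s, pagePattern.MatchString(s))
	}
}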

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Page

type Page struct {
	Head, Body Part
}

Page contains the parts for a page. In the case of an empty header or body, the empty part is set to zero length (Low == High), with its value chosen so that Head.Low:Body.High slices all lines inside the page.
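
A minimal sketch of that slicing invariant; the import path and the line content are made up for the example, and a real Page would come from Parse rather than being built by hand.

package main

import (
	"fmt"
	"strings"

	"example.com/parser" // hypothetical import path; substitute the real module path
)

func main() {
	// Made-up page layout: line 0 is the header, line 1 a clear,
	// lines 2-3 the body, and line 4 the trailing clear after the page.
	lines := strings.Split("title\n\nbody one\nbody two\n", "\n")

	p := parser.Page{
		Head: parser.Part{Low: 0, High: 1},
		Body: parser.Part{Low: 2, High: 4},
	}

	fmt.Println(lines[p.Head.Low:p.Head.High]) // header lines
	fmt.Println(lines[p.Body.Low:p.Body.High]) // body lines
	fmt.Println(lines[p.Head.Low:p.Body.High]) // every line inside the page
}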

func Parse

func Parse(tokens []tokenizer.Token) (pages []Page)

Parse finds the pages in tokens.
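
A hedged usage sketch: the import paths for this package and its tokenizer are placeholders, and the tokens are assumed to have been produced already, with 'a' retokenized as 't' as the overview requires.

package example // hypothetical wrapper package, shown only for illustration

import (
	"fmt"

	"example.com/parser"           // hypothetical import paths; substitute the real
	"example.com/parser/tokenizer" // locations of this package and its tokenizer
)

// summarize runs Parse and reports where each page's header and body sit.
func summarize(tokens []tokenizer.Token) {
	for i, p := range parser.Parse(tokens) {
		fmt.Printf("page %d: head %d:%d body %d:%d\n",
			i, p.Head.Low, p.Head.High, p.Body.Low, p.Body.High)
	}
}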

type Part

type Part struct {
	Low, High int
}

Part represents a header or body. It contains the values that would slice the part's lines out of a slice of all lines in the file.

type TokenError

type TokenError struct {
	Token tokenizer.Token
	Index int
}

TokenError describes an invalid token encountered at a certain line. Parse panics with a TokenError when it finds one.

func (TokenError) Error

func (err TokenError) Error() string

Error satisfies the error interface.
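
Because Parse signals invalid input by panicking with a TokenError rather than returning an error, a caller that prefers an ordinary error value can recover it. A sketch, using the same placeholder import paths as above:

package example // hypothetical wrapper package, shown only for illustration

import (
	"example.com/parser"           // hypothetical import paths; substitute the real
	"example.com/parser/tokenizer" // locations of this package and its tokenizer
)

// parseSafely converts Parse's panic into an ordinary error by recovering
// the TokenError that Parse panics with on invalid input.
func parseSafely(tokens []tokenizer.Token) (pages []parser.Page, err error) {
	defer func() {
		if r := recover(); r != nil {
			te, ok := r.(parser.TokenError)
			if !ok {
				panic(r) // not a TokenError; let the panic propagate
			}
			err = te
		}
	}()
	return parser.Parse(tokens), nil
}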

Directories

Path        Synopsis
generator   Command generator creates machine.go from machine.txt
