Documentation ¶
Index ¶
- func MapReduce(file io.Reader, parser StringParser, reducer Reducer) chan *Entry
- type Avg
- type Chain
- type Count
- type Datetime
- type Entry
- func (entry *Entry) Field(name string) (value string, err error)
- func (entry *Entry) Fields() Fields
- func (entry *Entry) FieldsHash(fields []string) string
- func (entry *Entry) FloatField(name string) (value float64, err error)
- func (entry *Entry) IntField(name string) (value int64, err error)
- func (entry *Entry) Merge(merge *Entry)
- func (entry *Entry) Partial(fields []string) *Entry
- func (entry *Entry) SetField(name string, value string)
- func (entry *Entry) SetFloatField(name string, value float64)
- func (entry *Entry) SetUintField(name string, value uint64)
- type Fields
- type Filter
- type GroupBy
- type Parser
- type ReadAll
- type Reader
- type Reducer
- type StringParser
- type Sum
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func MapReduce ¶ added in v1.1.0
func MapReduce(file io.Reader, parser StringParser, reducer Reducer) chan *Entry
MapReduce iterates over the given file, maps each of its lines into an Entry record using the parser, and applies the reducer to the Entries channel. Execution terminates once the result is read from the reducer's output channel, but the mapper keeps filling the input Entries channel until all lines have been read from the given file.
Types ¶
type Avg ¶ added in v1.2.0
type Avg struct {
Fields []string
}
Avg implements the Reducer interface for average entries values calculation
type Chain ¶ added in v1.2.0
type Chain struct {
// contains filtered or unexported fields
}
Chain implements the Reducer interface for chaining other reducers
type Count ¶ added in v1.2.0
type Count struct{}
Count implements the Reducer interface to count entries
type Datetime ¶ added in v1.3.0
Datetime implements the Filter interface to filter Entries with timestamp fields within the specified datetime interval.
type Entry ¶
type Entry struct {
// contains filtered or unexported fields
}
Entry is a parsed log record. Use the Field method to retrieve a value by name instead of treating this as a map, because the internal representation is subject to change.
func NewEmptyEntry ¶ added in v1.2.0
func NewEmptyEntry() *Entry
NewEmptyEntry creates an empty Entry to be filled later
func (*Entry) Field ¶ added in v1.2.0
Field returns an entry field value by name, or an empty string and an error if it does not exist.
func (*Entry) FieldsHash ¶ added in v1.2.0
FieldsHash returns a hash of all fields
func (*Entry) FloatField ¶ added in v1.2.0
FloatField returns an entry field value as float64. It returns an error if the field does not exist or a conversion error if the value cannot be cast.
func (*Entry) IntField ¶ added in v1.4.0
IntField returns an entry field value as int64. It returns an error if the field does not exist or a conversion error if the value cannot be cast.
func (*Entry) Merge ¶ added in v1.2.0
Merge merges two entries by updating the receiver entry's values with those of the given entry.
func (*Entry) Partial ¶ added in v1.2.0
Partial returns a partial field entry with the specified fields
func (*Entry) SetFloatField ¶ added in v1.2.0
SetFloatField is a float field value setter. It accepts a float64 but stores it as a string in the same fields map. The precision is 2, which is enough for log parsing tasks.
func (*Entry) SetUintField ¶ added in v1.2.0
SetUintField is an unsigned integer field value setter. It accepts a uint64 but stores it as a string in the same fields map.
type Filter ¶ added in v1.3.0
Filter is an interface for limiting the Entries channel.
The Filter method should accept an *Entry and return it if the entry meets the filter condition; otherwise it returns nil.
type GroupBy ¶ added in v1.2.0
type GroupBy struct {
Fields []string
// contains filtered or unexported fields
}
GroupBy implements the Reducer interface to apply other reducers and get data grouped by given fields.
func NewGroupBy ¶ added in v1.2.0
NewGroupBy creates a new GroupBy Reducer
type Parser ¶ added in v1.1.0
type Parser struct {
// contains filtered or unexported fields
}
Parser is a log record parser. Use specific constructors to initialize it.
func NewNginxParser ¶ added in v1.1.0
NewNginxParser parses the nginx conf file to find the log_format with the given name and returns a parser for that format. It returns an error if it cannot find the given log format.
type ReadAll ¶ added in v1.1.0
type ReadAll struct{}
ReadAll implements the Reducer interface; it simply redirects input entries to the output channel.
type Reader ¶
type Reader struct {
// contains filtered or unexported fields
}
Reader is a log file reader. Use specific constructors to create it.
func NewNginxReader ¶
func NewNginxReader(logFile io.Reader, nginxConf io.Reader, formatName string) (reader *Reader, err error)
NewNginxReader creates a reader for the nginx log format. The nginx config parser will be used to get the particular format from the conf file.
func NewParserReader ¶ added in v1.4.0
func NewParserReader(logFile io.Reader, parser StringParser) *Reader
NewParserReader creates a reader with the given parser
type Reducer ¶ added in v1.1.0
Reducer is an interface for reducing the Entries channel.
Each Reduce method should accept an input channel of Entries, do its job, and write the result to the output channel.
It does not return values because it usually runs in a separate goroutine, and a channel is handy for retrieving the reduced data.
type StringParser ¶ added in v1.3.0
StringParser is the interface that wraps the ParseString method.
Source Files ¶
Directories ¶
Path | Synopsis
---|---
example | Example program that reads a big nginx file from stdin line by line and measures reading time.