Documentation ¶
Overview ¶
Package obfuscate implements quantizing and obfuscating of tags and resources for a set of spans matching certain criteria.
Index ¶
Constants ¶
const (
	EOFChar           = 0x100
	LexError          = 57346
	ID                = 57347
	Limit             = 57348
	Null              = 57349
	String            = 57350
	Number            = 57351
	BooleanLiteral    = 57352
	ValueArg          = 57353
	ListArg           = 57354
	Comment           = 57355
	Variable          = 57356
	Savepoint         = 57357
	PreparedStatement = 57358
	EscapeSequence    = 57359
	NullSafeEqual     = 57360
	LE                = 57361
	GE                = 57362
	NE                = 57363
	As                = 57365

	// Filtered specifies that the given token has been discarded by one of the
	// token filters.
	Filtered = 57364

	// FilteredComma specifies that the token is a comma and was discarded by one
	// of the filters.
	FilteredComma = 57366
)
List of available tokens; this list has been reduced because we don't need a full-fledged tokenizer to implement a Lexer.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type DiscardFilter ¶
type DiscardFilter struct{}
DiscardFilter implements the TokenFilter interface so that the given token is discarded or accepted.
func (*DiscardFilter) Filter ¶
func (f *DiscardFilter) Filter(token, lastToken int, buffer []byte) (int, []byte)
Filter the given token so that a nil slice is returned if the token is in the filtered token list.
func (*DiscardFilter) Reset ¶
func (f *DiscardFilter) Reset()
Reset in a DiscardFilter is a no-op.
type GroupingFilter ¶
type GroupingFilter struct {
// contains filtered or unexported fields
}
GroupingFilter implements the TokenFilter interface so that, when a common pattern is identified, it is discarded to prevent duplicates.
func (*GroupingFilter) Filter ¶
func (f *GroupingFilter) Filter(token, lastToken int, buffer []byte) (int, []byte)
Filter the given token so that it will be discarded if a grouping pattern has been recognized. A grouping is composed of items such as the following (an illustrative example appears after the list):
- '( ?, ?, ? )'
- '( ?, ? ), ( ?, ? )'
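For instance, assuming the default chain of DiscardFilter, ReplaceFilter and GroupingFilter, a statement along the lines of

	INSERT INTO users ( name, age ) VALUES ( 'bob', 21 ), ( 'alice', 42 )

would be reduced to something like

	INSERT INTO users ( name, age ) VALUES ( ? )

The exact spacing and output depend on the configured filters; this is an illustrative sketch rather than guaranteed output.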
func (*GroupingFilter) Reset ¶
func (f *GroupingFilter) Reset()
Reset in a GroupingFilter restores the variables used to count escaped tokens that should be filtered.
type Obfuscator ¶
type Obfuscator struct {
// contains filtered or unexported fields
}
Obfuscator quantizes and obfuscates spans. The obfuscator is not safe for concurrent use.
func NewObfuscator ¶
func NewObfuscator(cfg *config.ObfuscationConfig) *Obfuscator
NewObfuscator creates a new Obfuscator.
func (*Obfuscator) Obfuscate ¶
func (o *Obfuscator) Obfuscate(span *model.Span)
Obfuscate may obfuscate the span's properties based on its type and on the Obfuscator's configuration.
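A minimal usage sketch is shown below; the import paths, the config.ObfuscationConfig value and the model.Span fields are assumptions based on the signatures above, not something this page documents:

	package main

	import (
		"github.com/DataDog/datadog-trace-agent/config"    // assumed import path
		"github.com/DataDog/datadog-trace-agent/model"     // assumed import path
		"github.com/DataDog/datadog-trace-agent/obfuscate" // assumed import path
	)

	func main() {
		// NewObfuscator takes the obfuscation section of the agent configuration.
		o := obfuscate.NewObfuscator(&config.ObfuscationConfig{})

		// Obfuscate rewrites the span in place according to its type.
		span := &model.Span{
			Type:     "sql",                                // assumed field
			Resource: "SELECT * FROM users WHERE id = 42", // assumed field
		}
		o.Obfuscate(span)
	}

Because the Obfuscator is not safe for concurrent use, each goroutine should own its instance or access must be serialized by the caller.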
type ReplaceFilter ¶
type ReplaceFilter struct{}
ReplaceFilter implements the TokenFilter interface so that the given token is replaced with '?' or left unchanged.
func (*ReplaceFilter) Filter ¶
func (f *ReplaceFilter) Filter(token, lastToken int, buffer []byte) (int, []byte)
Filter the given token so that it will be replaced if it is in the token replacement list.
func (*ReplaceFilter) Reset ¶
func (f *ReplaceFilter) Reset()
Reset in a ReplaceFilter is a no-op.
type SyntaxError ¶
type SyntaxError struct {
	Offset int64 // error occurred after reading Offset bytes
	// contains filtered or unexported fields
}
A SyntaxError is a description of a JSON syntax error.
func (*SyntaxError) Error ¶
func (e *SyntaxError) Error() string
type TokenConsumer ¶
type TokenConsumer struct {
// contains filtered or unexported fields
}
TokenConsumer is a Tokenizer consumer. It calls the Tokenizer's Scan() function as long as tokens are available, stopping if a LexError is raised. After retrieving a token, it is passed through the TokenFilter chain so that the token can be discarded or replaced.
func NewTokenConsumer ¶
func NewTokenConsumer(filters []TokenFilter) *TokenConsumer
NewTokenConsumer returns a new TokenConsumer capable of processing SQL and No-SQL strings.
func (*TokenConsumer) Process ¶
func (t *TokenConsumer) Process(in string) (string, error)
Process the given SQL or No-SQL string so that the result is properly altered. This function is generic: its behavior depends on the chosen TokenFilter implementations. Processing runs each token through every filter in the []TokenFilter chain.
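The sketch below wires the filters documented above into a TokenConsumer; the import path is an assumption and the printed output is only what one would expect, not guaranteed:

	package main

	import (
		"fmt"

		"github.com/DataDog/datadog-trace-agent/obfuscate" // assumed import path
	)

	func main() {
		// Build a consumer with the three filters described in this package.
		consumer := obfuscate.NewTokenConsumer([]obfuscate.TokenFilter{
			&obfuscate.DiscardFilter{},
			&obfuscate.ReplaceFilter{},
			&obfuscate.GroupingFilter{},
		})

		out, err := consumer.Process("SELECT * FROM users WHERE id = 42 AND name = 'bob'")
		if err != nil {
			fmt.Println("obfuscation failed:", err)
			return
		}
		fmt.Println(out) // literals should come out replaced with ?

		// Reset restores internal state so the consumer can be reused.
		consumer.Reset()
	}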
func (*TokenConsumer) Reset ¶
func (t *TokenConsumer) Reset()
Reset restores the initial state of all components so that memory can be re-used.
type TokenFilter ¶
TokenFilter is a generic interface that a TokenConsumer expects. It defines the Filter() function used to filter or replace given tokens. A filter can be stateful and keep an internal state to apply the filter later; this can be useful to prevent backtracking in some cases.
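The interface definition is not shown on this page; based on the Filter and Reset methods implemented by DiscardFilter, ReplaceFilter and GroupingFilter above, it presumably looks like the following sketch:

	type TokenFilter interface {
		// Filter receives the current and previous token along with the token's
		// buffer, and returns the (possibly rewritten) token and buffer; a nil
		// buffer means the token is discarded.
		Filter(token, lastToken int, buffer []byte) (int, []byte)

		// Reset restores the filter's internal state so it can be reused.
		Reset()
	}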
type Tokenizer ¶
type Tokenizer struct {
	InStream *strings.Reader
	Position int
	// contains filtered or unexported fields
}
Tokenizer is the struct used to generate SQL tokens for the parser.
func NewStringTokenizer ¶
func NewStringTokenizer(sql string) *Tokenizer
NewStringTokenizer creates a new Tokenizer for the sql string.