Documentation
Overview
Package obfuscate implements quantizing and obfuscating of tags and resources for a set of spans matching certain criteria.
Index
- Constants
- type ObfuscatedQuery
- type Obfuscator
- func (o *Obfuscator) Obfuscate(span *pb.Span)
- func (o *Obfuscator) ObfuscateSQLExecPlan(jsonPlan string, normalize bool) (string, error)
- func (o *Obfuscator) ObfuscateSQLString(in string) (*ObfuscatedQuery, error)
- func (o *Obfuscator) ObfuscateStatsGroup(b *pb.ClientGroupedStats)
- func (*Obfuscator) QuantizeRedisString(query string) string
- func (o *Obfuscator) SQLLiteralEscapes() bool
- func (o *Obfuscator) SetSQLLiteralEscapes(ok bool)
- func (o *Obfuscator) Stop()
- type SQLTokenizer
- type SyntaxError
- type TokenKind
Constants

const (
	LexError = TokenKind(57346) + iota
	ID
	Limit
	Null
	String
	DoubleQuotedString
	DollarQuotedString // https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-SYNTAX-DOLLAR-QUOTING
	DollarQuotedFunc   // a dollar-quoted string delimited by the tag "$func$"; gets special treatment when the feature "dollar_quoted_func" is set
	Number
	BooleanLiteral
	ValueArg
	ListArg
	Comment
	Variable
	Savepoint
	PreparedStatement
	EscapeSequence
	NullSafeEqual
	LE
	GE
	NE
	As
	From
	Update
	Insert
	Into
	Join
	TableName
	ColonCast

	// FilteredGroupable specifies that the given token has been discarded by one of the
	// token filters and that it is groupable together with consecutive FilteredGroupable
	// tokens.
	FilteredGroupable

	// FilteredGroupableParenthesis is a parenthesis marked as filtered groupable. It is the
	// beginning of either a group of values ('(') or a nested query. We track it as
	// a special case for when it may start a nested query as opposed to just another
	// value group to be obfuscated.
	FilteredGroupableParenthesis

	// Filtered specifies that the token is a comma and was discarded by one
	// of the filters.
	Filtered

	// FilteredBracketedIdentifier specifies that we are currently discarding
	// a bracketed identifier (MSSQL).
	// See issue https://github.com/DataDog/datadog-trace-agent/issues/475.
	FilteredBracketedIdentifier
)

The list of available tokens; it has been reduced because a full-fledged tokenizer is not needed to implement the lexer.
const EndChar = unicode.MaxRune + 1
EndChar is used to signal that the scanner has finished reading the query. This happens when there are no more characters left in the query or when invalid encoding is discovered. EndChar is an invalid rune value that can not be found in any valid string.
Variables
This section is empty.
Functions
This section is empty.
Types
type ObfuscatedQuery

type ObfuscatedQuery struct {
	Query     string // the obfuscated SQL query
	TablesCSV string // comma-separated list of tables that the query addresses
}
ObfuscatedQuery specifies information about an obfuscated SQL query.
func (*ObfuscatedQuery) Cost added in v0.9.0
func (oq *ObfuscatedQuery) Cost() int64
Cost returns the number of bytes needed to store all the fields of this ObfuscatedQuery.
type Obfuscator
type Obfuscator struct {
// contains filtered or unexported fields
}
Obfuscator quantizes and obfuscates spans. The obfuscator is not safe for concurrent use.
func NewObfuscator
func NewObfuscator(cfg *config.ObfuscationConfig) *Obfuscator
NewObfuscator creates a new Obfuscator.
func (*Obfuscator) Obfuscate
func (o *Obfuscator) Obfuscate(span *pb.Span)
Obfuscate may obfuscate the span's properties based on its type and on the Obfuscator's configuration.
func (*Obfuscator) ObfuscateSQLExecPlan added in v0.9.0
func (o *Obfuscator) ObfuscateSQLExecPlan(jsonPlan string, normalize bool) (string, error)
ObfuscateSQLExecPlan obfuscates query conditions in the provided JSON-encoded execution plan. If normalize is true, cost and row estimates are also obfuscated away.
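A minimal usage sketch. The import paths, the empty config literal, and the sample plan JSON are assumptions drawn from the datadog-agent repository and may differ between versions:

```go
package main

// Sketch only: these import paths are assumptions and may differ
// between datadog-agent versions.
import (
	"fmt"
	"log"

	"github.com/DataDog/datadog-agent/pkg/trace/config"
	"github.com/DataDog/datadog-agent/pkg/trace/obfuscate"
)

func main() {
	o := obfuscate.NewObfuscator(&config.ObfuscationConfig{})
	defer o.Stop()

	// A hypothetical PostgreSQL-style execution plan fragment.
	plan := `{"Plan": {"Node Type": "Seq Scan", "Filter": "(id = 42)", "Total Cost": 8.27}}`

	// normalize=true additionally obfuscates cost and row estimates.
	out, err := o.ObfuscateSQLExecPlan(plan, true)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```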
func (*Obfuscator) ObfuscateSQLString
func (o *Obfuscator) ObfuscateSQLString(in string) (*ObfuscatedQuery, error)
ObfuscateSQLString quantizes and obfuscates the given input SQL query string. Quantization removes some elements such as comments and aliases, and obfuscation attempts to hide sensitive information in strings and numbers by redacting them.
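A minimal usage sketch. The import paths and the empty config literal are assumptions drawn from the datadog-agent repository, and the exact redacted output depends on the configured features:

```go
package main

// Sketch only: these import paths are assumptions and may differ
// between datadog-agent versions.
import (
	"fmt"
	"log"

	"github.com/DataDog/datadog-agent/pkg/trace/config"
	"github.com/DataDog/datadog-agent/pkg/trace/obfuscate"
)

func main() {
	o := obfuscate.NewObfuscator(&config.ObfuscationConfig{})
	defer o.Stop()

	oq, err := o.ObfuscateSQLString("SELECT name FROM users WHERE id = 42 AND city = 'Paris'")
	if err != nil {
		log.Fatal(err)
	}
	// Literals such as 42 and 'Paris' are redacted in oq.Query.
	fmt.Println(oq.Query)
	// oq.TablesCSV lists the tables the query addresses (here, "users"),
	// when table-name collection is enabled in the configuration.
	fmt.Println(oq.TablesCSV)
}
```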
func (*Obfuscator) ObfuscateStatsGroup added in v0.9.0
func (o *Obfuscator) ObfuscateStatsGroup(b *pb.ClientGroupedStats)
ObfuscateStatsGroup obfuscates the given stats bucket group.
func (*Obfuscator) QuantizeRedisString added in v0.9.0
func (*Obfuscator) QuantizeRedisString(query string) string
QuantizeRedisString returns a quantized version of a Redis query.
TODO(gbbr): Refactor this method to use the tokenizer and remove "compactWhitespaces". This method is buggy when commands contain quoted strings with newlines.
func (*Obfuscator) SQLLiteralEscapes
func (o *Obfuscator) SQLLiteralEscapes() bool
SQLLiteralEscapes reports whether escape characters should be treated literally by the SQL obfuscator.
func (*Obfuscator) SetSQLLiteralEscapes
func (o *Obfuscator) SetSQLLiteralEscapes(ok bool)
SetSQLLiteralEscapes sets whether or not escape characters should be treated literally by the SQL obfuscator.
func (*Obfuscator) Stop added in v0.9.0
func (o *Obfuscator) Stop()
Stop cleans up after a finished Obfuscator.
type SQLTokenizer
type SQLTokenizer struct {
// contains filtered or unexported fields
}
SQLTokenizer is the struct used to generate SQL tokens for the parser.
func NewSQLTokenizer
func NewSQLTokenizer(sql string, literalEscapes bool) *SQLTokenizer
NewSQLTokenizer creates a new SQLTokenizer for the given SQL string. The literalEscapes argument specifies whether escape characters should be treated literally or as escape sequences.
func (*SQLTokenizer) Err
func (tkn *SQLTokenizer) Err() error
Err returns the last error that the tokenizer encountered, or nil.
func (*SQLTokenizer) Reset
func (tkn *SQLTokenizer) Reset(in string)
Reset resets the underlying buffer and positions.
func (*SQLTokenizer) Scan
func (tkn *SQLTokenizer) Scan() (TokenKind, []byte)
Scan scans the tokenizer for the next token and returns the token type and the token buffer.
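A sketch of a typical scan loop. The import path is an assumption drawn from the datadog-agent repository, and the loop assumes Scan reports EndChar once the input is exhausted, as the EndChar documentation above implies:

```go
package main

// Sketch only: the import path is an assumption and may differ
// between datadog-agent versions.
import (
	"fmt"
	"log"

	"github.com/DataDog/datadog-agent/pkg/trace/obfuscate"
)

func main() {
	tkn := obfuscate.NewSQLTokenizer("SELECT * FROM users WHERE id = 1", false)
	for {
		kind, buf := tkn.Scan()
		if kind == obfuscate.EndChar {
			// No more characters left in the query.
			break
		}
		if kind == obfuscate.LexError {
			log.Fatal(tkn.Err())
		}
		fmt.Printf("%d %q\n", kind, buf)
	}
}
```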
func (*SQLTokenizer) SeenEscape
func (tkn *SQLTokenizer) SeenEscape() bool
SeenEscape reports whether this tokenizer has seen an escape character within a scanned string.
type SyntaxError
type SyntaxError struct {
	Offset int64 // error occurred after reading Offset bytes
	// contains filtered or unexported fields
}
A SyntaxError is a description of a JSON syntax error.
func (*SyntaxError) Error
func (e *SyntaxError) Error() string