Documentation ¶
Overview ¶
Package sqltocsvgzip converts database query results (in the form of database/sql Rows) into CSV.GZIP output.
Source and README at https://github.com/thatInfrastructureGuy/sqltocsvgzip
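A minimal quick-start sketch based on the package-level WriteFile signature listed in the index below; the driver, DSN, and query are placeholders, not part of this package.

package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq" // placeholder driver choice
	"github.com/thatInfrastructureGuy/sqltocsvgzip"
)

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/mydb?sslmode=disable") // placeholder DSN
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query("SELECT id, name FROM users") // placeholder query
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	// Convert the rows to gzipped CSV and write them to a local file.
	rowCount, err := sqltocsvgzip.WriteFile("users.csv.gz", rows)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("wrote %d rows", rowCount)
}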
Index ¶
- func UploadToS3(rows *sql.Rows) (rowCount int64, err error)
- func WriteFile(csvGzipFileName string, rows *sql.Rows) (rowCount int64, err error)
- type Converter
- func (c *Converter) AddToQueue(buf *bytes.Buffer, lastPart bool)
- func (c *Converter) SetRowPreProcessor(processor CsvPreProcessorFunc)
- func (c *Converter) Upload() (rowCount int64, err error)
- func (c *Converter) UploadObjectToS3(w io.Writer) error
- func (c *Converter) UploadPart() (err error)
- func (c *Converter) Write(w io.Writer) error
- func (c *Converter) WriteFile(csvGzipFileName string) (rowCount int64, err error)
- type CsvPreProcessorFunc
- type LogLevel
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func UploadToS3 ¶ added in v0.0.3
UploadToS3 will upload a CSV.GZIP file (with headers) to an AWS S3 bucket based on whatever is in the sql.Rows you pass in. UploadToS3 reads its configuration from the following environment variables.
Required: S3_BUCKET, S3_PATH, S3_REGION
Optional: S3_ACL (default: bucket-owner-full-control)
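A hedged sketch of calling UploadToS3, continuing from a *sql.Rows obtained as in the quick-start above; the bucket, path, and region values are placeholders, and the os and log imports are assumed.

// Configuration is read from environment variables, as described above.
// In practice you would normally set these in the process environment rather than from code.
os.Setenv("S3_BUCKET", "my-bucket")          // required (placeholder value)
os.Setenv("S3_PATH", "exports/users.csv.gz") // required (placeholder value)
os.Setenv("S3_REGION", "us-east-1")          // required (placeholder value)
// S3_ACL is optional and defaults to bucket-owner-full-control.

rowCount, err := sqltocsvgzip.UploadToS3(rows)
if err != nil {
	log.Fatal(err)
}
log.Printf("uploaded %d rows", rowCount)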
Types ¶
type Converter ¶
type Converter struct {
	LogLevel              LogLevel
	Headers               []string // Column headers to use (default is rows.Columns())
	WriteHeaders          bool     // Flag to output headers in your CSV (default is true)
	TimeFormat            string   // Format string for any time.Time values (default is time's default)
	Delimiter             rune     // Delimiter to use in your CSV (default is comma)
	CsvBufferSize         int
	CompressionLevel      int
	GzipGoroutines        int
	GzipBatchPerGoroutine int
	S3Bucket              string
	S3Region              string
	S3Acl                 string
	S3Path                string
	S3Upload              bool
	UploadThreads         int
	UploadPartSize        int
	RowCount              int64
	// contains filtered or unexported fields
}
Converter does the actual work of converting the rows to CSV. There are a few settings you can override if you want to do some fancy stuff to your CSV.
func UploadConfig ¶ added in v0.0.6
UploadConfig sets the default values for the Converter struct.
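A hedged sketch of the upload flow, assuming UploadConfig takes the *sql.Rows and returns a *Converter (its signature is not shown on this page); the bucket, region, and path values are placeholders and can also be supplied via the environment variables listed under UploadToS3.

c := sqltocsvgzip.UploadConfig(rows) // assumed signature: UploadConfig(rows *sql.Rows) *Converter
c.S3Bucket = "my-bucket"             // placeholder value
c.S3Region = "us-east-1"             // placeholder value
c.S3Path = "exports/users.csv.gz"    // placeholder value

rowCount, err := c.Upload()
if err != nil {
	log.Fatal(err)
}
log.Printf("uploaded %d rows", rowCount)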
func WriteConfig ¶ added in v0.0.6
WriteConfig will return a Converter which will write your CSV however you like, but allows you to set non-default behaviour such as overriding headers or injecting a pre-processing step into your conversion.
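A hedged sketch of customizing the conversion, assuming WriteConfig takes the *sql.Rows and returns a *Converter (its signature is not shown on this page); the header names, delimiter, and file name are placeholders.

c := sqltocsvgzip.WriteConfig(rows) // assumed signature: WriteConfig(rows *sql.Rows) *Converter
c.Headers = []string{"ID", "Name"}  // override the column headers
c.Delimiter = '|'                   // write pipe-separated values instead of commas
c.TimeFormat = time.RFC3339         // requires importing "time"

rowCount, err := c.WriteFile("users.csv.gz")
if err != nil {
	log.Fatal(err)
}
log.Printf("wrote %d rows", rowCount)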
func (*Converter) AddToQueue ¶ added in v0.0.3
AddToQueue sends the buffer over the upload queue. Currently, it is designed to work with AWS multipart upload. If the part body is less than 5 MB in size, two parts are combined before sending.
func (*Converter) SetRowPreProcessor ¶
func (c *Converter) SetRowPreProcessor(processor CsvPreProcessorFunc)
SetRowPreProcessor lets you specify a CsvPreProcessorFunc for this conversion.
func (*Converter) Upload ¶ added in v0.0.4
Upload uploads the CSV.GZIP, returning an error if there is a problem. It creates an AWS multipart request, completes the multipart request if all part uploads are successful, and aborts the operation when an error is received.
func (*Converter) UploadObjectToS3 ¶ added in v0.0.3
UploadObjectToS3 uploads a file to AWS S3 without batching.
func (*Converter) UploadPart ¶ added in v0.0.5
UploadPart listens to the upload queue. Whenever an object is received, it is uploaded to AWS. The abort operation is called if any error is received.
type CsvPreProcessorFunc ¶
type CsvPreProcessorFunc func(row []string, columnNames []string) (outputRow bool, processedRow []string)
CsvPreProcessorFunc is a function type for preprocessing your CSV. It takes the columns after they've been munged into strings but before they've been passed into the CSV writer.
Return an outputRow of false if you want the row skipped; otherwise, return the processed row slice as you want it written to the CSV.
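A hedged sketch of a pre-processor that masks one column and skips incomplete rows, registered via SetRowPreProcessor; the column name is a placeholder and the WriteConfig constructor signature is assumed, as in the WriteConfig sketch above.

// maskEmails redacts the "email" column and skips rows where it is empty.
maskEmails := func(row []string, columnNames []string) (outputRow bool, processedRow []string) {
	for i, name := range columnNames {
		if name == "email" {
			if row[i] == "" {
				return false, nil // skip this row entirely
			}
			row[i] = "redacted"
		}
	}
	return true, row
}

c := sqltocsvgzip.WriteConfig(rows) // assumed constructor, as above
c.SetRowPreProcessor(maskEmails)

rowCount, err := c.WriteFile("users.csv.gz")
if err != nil {
	log.Fatal(err)
}
log.Printf("wrote %d rows", rowCount)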