datasource

package
v1.6.0
Published: May 12, 2024 License: Apache-2.0 Imports: 2 Imported by: 8

README

Datasource

GoFr provides the following features to ensure robust and observable interactions with various data sources:

  1. Health Checks

A mechanism for a datasource to self-report its operational status. New datasources must implement the HealthCheck() method with the following signature:

HealthCheck() datasource.Health

This method should return the current health status of the datasource.
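As a minimal sketch, assuming the package is imported from gofr.dev/pkg/gofr/datasource and using *sql.DB only as a stand-in for whatever connection handle the real datasource holds (the Health struct and status constants are documented under Types and Constants below):

package mydatasource

import (
	"database/sql"

	"gofr.dev/pkg/gofr/datasource"
)

// Client wraps a connection handle; *sql.DB is used here only as a stand-in
// for the connection type a real datasource would hold.
type Client struct {
	db   *sql.DB
	host string
}

// HealthCheck reports the current operational status of the datasource.
func (c *Client) HealthCheck() datasource.Health {
	h := datasource.Health{
		Details: map[string]interface{}{"host": c.host},
	}

	if err := c.db.Ping(); err != nil {
		h.Status = datasource.StatusDown
		h.Details["error"] = err.Error()

		return h
	}

	h.Status = datasource.StatusUp

	return h
}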

  2. Retry Mechanism

GoFr attempts to re-establish connections if lost during application runtime. New datasources should be verified for built-in retry mechanisms. If absent, implement a mechanism for automatic reconnection.
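GoFr does not prescribe a particular retry strategy here; the sketch below shows one possible shape, with a caller-supplied connect function and a fixed retry interval chosen purely for illustration:

import (
	"context"
	"time"
)

// retryConnect keeps trying to (re)establish a connection until it succeeds
// or the context is cancelled. The connect function and interval are
// illustrative; a real datasource would plug in its own dial logic and
// backoff policy.
func retryConnect(ctx context.Context, interval time.Duration, connect func() error) {
	for {
		if err := connect(); err == nil {
			return // connected
		}

		select {
		case <-ctx.Done():
			return // application is shutting down
		case <-time.After(interval):
			// wait before the next attempt
		}
	}
}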

  3. Metrics

Datasources should expose relevant metrics for performance monitoring. The specific metrics to be implemented depend on the datasource type. Discussions are required to determine the appropriate metrics for each new datasource.
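Purely as an illustration (the Metrics interface, the metric name, and the labels below are assumptions for this sketch, not part of this package), a datasource client could accept a small metrics interface and record the latency of each operation:

import (
	"context"
	"time"
)

// Metrics is a hypothetical interface for this sketch; the real set of
// metrics and their names would be agreed per datasource.
type Metrics interface {
	RecordHistogram(ctx context.Context, name string, value float64, labels ...string)
}

// withQueryMetrics records how long an operation took against a histogram.
func withQueryMetrics(ctx context.Context, m Metrics, operation string, fn func() error) error {
	start := time.Now()
	err := fn()

	// "app_datasource_stats" is an illustrative metric name, not a GoFr constant.
	m.RecordHistogram(ctx, "app_datasource_stats", float64(time.Since(start).Microseconds()),
		"operation", operation)

	return err
}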

  4. Logging

GoFr supports level-based logging with the PrettyPrint interface. New datasources should implement logging with the following levels:

  • DEBUG: Logs connection attempts with critical details.
  • INFO: Logs successful connection establishment.

Additional logs can be added to enhance debugging and improve the user experience.
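Using the Logger interface defined in this package (documented under Types below, and assuming the import path gofr.dev/pkg/gofr/datasource), a connection routine might emit these levels as follows; net.Dial stands in for the driver's own connect call:

import (
	"fmt"
	"net"

	"gofr.dev/pkg/gofr/datasource"
)

// connect logs at DEBUG before the attempt and at INFO (Logf) on success.
func connect(logger datasource.Logger, host string, port int) (net.Conn, error) {
	logger.Debugf("connecting to datasource at %s:%d", host, port)

	conn, err := net.Dial("tcp", fmt.Sprintf("%s:%d", host, port))
	if err != nil {
		logger.Errorf("could not connect to datasource at %s:%d: %v", host, port, err)
		return nil, err
	}

	logger.Logf("connected to datasource at %s:%d", host, port)

	return conn, nil
}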

  5. Tracing

GoFr supports tracing for all datasources; for example, SQL requests are traced using github.com/XSAM/otelsql. If no official or widely used instrumentation package is available, we have to implement our own, but that is in the scope of a different issue.
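For a datasource that needs hand-rolled instrumentation, one option is to wrap each call in an OpenTelemetry span. This is a generic OpenTelemetry sketch with illustrative tracer and attribute names, not a GoFr-specific API:

import (
	"context"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/attribute"
	"go.opentelemetry.io/otel/trace"
)

// traceQuery wraps a datasource call in a span; tracer name, operation and
// attribute keys are illustrative.
func traceQuery(ctx context.Context, operation, collection string, fn func(context.Context) error) error {
	ctx, span := otel.Tracer("mydatasource").Start(ctx, operation,
		trace.WithAttributes(
			attribute.String("db.operation", operation),
			attribute.String("db.collection", collection),
		))
	defer span.End()

	return fn(ctx)
}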

All logs should include:

  • Timestamp
  • Request ID (Correlation ID)
  • Time taken to execute the query
  • Datasource name (consistent with other logs)
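One way to carry these fields is a small log struct. The PrettyPrint signature below is an assumption about the interface mentioned above (a method that writes a human-readable line to an io.Writer), and the field names are illustrative:

import (
	"fmt"
	"io"
	"time"
)

// QueryLog is an illustrative log entry carrying the fields listed above.
type QueryLog struct {
	Timestamp     time.Time `json:"timestamp"`
	CorrelationID string    `json:"correlation_id"`
	Duration      int64     `json:"duration"` // microseconds taken to execute the query
	Datasource    string    `json:"datasource"`
	Query         string    `json:"query"`
}

// PrettyPrint writes a single human-readable line; the signature is an
// assumption, not the definition used by GoFr's logging package.
func (l QueryLog) PrettyPrint(w io.Writer) {
	fmt.Fprintf(w, "%s %s %8dµs %-10s %s\n",
		l.Timestamp.Format(time.RFC3339), l.CorrelationID, l.Duration, l.Datasource, l.Query)
}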

Implementing New Datasources

GoFr offers built-in support for popular datasources such as SQL (MySQL, PostgreSQL), Redis, and Pub/Sub (with MQTT, Kafka, or Google Pub/Sub as the backend). Including additional functionality within the core GoFr binary would increase the application size unnecessarily.

Therefore, GoFr utilizes a pluggable approach for new datasources by separating implementation in the following way:

  • Interface Definition:

    Create an interface with the required methods within the datasource package and register it with the container (similar to Mongo in https://github.com/tfogo/mongodb-go-tutorial). A combined sketch of all three steps follows this list.

  • Method Registration:

    Create a method in gofr.go (similar to the existing one) that accepts the newly defined interface.

  • Separate Repository:

    Develop a separate repository to implement the interface for the new datasource. This approach ensures that the new datasource dependency is only loaded when utilized, minimizing binary size for GoFr applications. It also empowers users to create custom implementations beyond the defaults provided by GoFr.
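Put together, the pattern looks roughly like the sketch below. The Cache interface, the App/container placeholders, and the AddCache method are hypothetical names used only to show the shape; in the real layout the pieces live in the datasource package, gofr.go, and a separate driver repository respectively:

import "context"

// Step 1: in the datasource package, define the interface the framework
// depends on. "Cache" is a hypothetical datasource used only for this sketch.
type Cache interface {
	Get(ctx context.Context, key string) (string, error)
	Set(ctx context.Context, key, value string) error
	HealthCheck() Health
}

// Step 2: in gofr.go, add a registration method that accepts the interface,
// so the concrete driver is compiled in only when the user passes one.
// App and its container field are simplified placeholders here.
type App struct {
	container struct{ Cache Cache }
}

func (a *App) AddCache(c Cache) {
	a.container.Cache = c
}

// Step 3: in a separate repository, provide a concrete type implementing
// Cache, which the user wires in (for example, app.AddCache(cacheclient.New(cfg))).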

Supported Datasources

Datasource   Health-Check   Logs   Metrics   Traces   As Driver
MySQL
Redis
PostgreSQL
MongoDB

The above table has to be updated when a PR for a new datasource is created.

Documentation

Index

Constants

const (
	StatusUp   = "UP"
	StatusDown = "DOWN"
)

Variables

This section is empty.

Functions

This section is empty.

Types

type Datasource added in v0.2.0

type Datasource interface {
	Register(config config.Config)
}

type Health

type Health struct {
	Status  string                 `json:"status,omitempty"`
	Details map[string]interface{} `json:"details,omitempty"`
}

type Logger

type Logger interface {
	Debug(args ...interface{})
	Debugf(format string, args ...interface{})
	Log(args ...interface{})
	Logf(format string, args ...interface{})
	Error(args ...interface{})
	Errorf(format string, args ...interface{})
}

Logger interface is used by datasource packages to log information about query execution. Developer Notes: this is a reduced version of the logging.Logger interface. We do not use that package directly, so that the datasource package does not depend on the logging package. This way, the logging package can import the datasource package and provide a different "pretty" version for the log types defined here, while avoiding a cyclical import. Idiomatically, interfaces should be defined by the packages that use them, unlike in some other languages. Also: accept interfaces, return concrete types.

type Mongo added in v1.3.0

type Mongo interface {
	// Find executes a query to find documents in a collection based on a filter and stores the results
	// into the provided results interface.
	Find(ctx context.Context, collection string, filter interface{}, results interface{}) error

	// FindOne executes a query to find a single document in a collection based on a filter and stores the result
	// into the provided result interface.
	FindOne(ctx context.Context, collection string, filter interface{}, result interface{}) error

	// InsertOne inserts a single document into a collection.
	// It returns the identifier of the inserted document and an error, if any.
	InsertOne(ctx context.Context, collection string, document interface{}) (interface{}, error)

	// InsertMany inserts multiple documents into a collection.
	// It returns the identifiers of the inserted documents and an error, if any.
	InsertMany(ctx context.Context, collection string, documents []interface{}) ([]interface{}, error)

	// DeleteOne deletes a single document from a collection based on a filter.
	// It returns the number of documents deleted and an error, if any.
	DeleteOne(ctx context.Context, collection string, filter interface{}) (int64, error)

	// DeleteMany deletes multiple documents from a collection based on a filter.
	// It returns the number of documents deleted and an error, if any.
	DeleteMany(ctx context.Context, collection string, filter interface{}) (int64, error)

	// UpdateByID updates a document in a collection by its ID.
	// It returns the number of documents updated and an error if any.
	UpdateByID(ctx context.Context, collection string, id interface{}, update interface{}) (int64, error)

	// UpdateOne updates a single document in a collection based on a filter.
	// It returns an error if any.
	UpdateOne(ctx context.Context, collection string, filter interface{}, update interface{}) error

	// UpdateMany updates multiple documents in a collection based on a filter.
	// It returns the number of documents updated and an error if any.
	UpdateMany(ctx context.Context, collection string, filter interface{}, update interface{}) (int64, error)

	// CountDocuments counts the number of documents in a collection based on a filter.
	// It returns the count and an error if any.
	CountDocuments(ctx context.Context, collection string, filter interface{}) (int64, error)

	// Drop an entire collection from the database.
	// It returns an error if any.
	Drop(ctx context.Context, collection string) error
}

Mongo is an interface representing a MongoDB database client with common CRUD operations.
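As a usage sketch, a function that depends only on this interface (so it works with the real driver or with a mock in tests) might look like the following; the User type, collection name, and import path are illustrative assumptions:

import (
	"context"

	"gofr.dev/pkg/gofr/datasource"
)

// User is an illustrative document shape.
type User struct {
	Name  string `bson:"name"`
	Email string `bson:"email"`
}

// createUser inserts a document and reads it back, using only the methods
// of the Mongo interface defined above.
func createUser(ctx context.Context, db datasource.Mongo, u User) (User, error) {
	if _, err := db.InsertOne(ctx, "users", u); err != nil {
		return User{}, err
	}

	var stored User
	if err := db.FindOne(ctx, "users", map[string]interface{}{"email": u.Email}, &stored); err != nil {
		return User{}, err
	}

	return stored, nil
}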

Directories

Path Synopsis
cassandra module
clickhouse module
dgraph module
file
ftp module
s3 module
sftp module
kv-store
badger module
mongo module
pubsub
Package pubsub provides a foundation for implementing pub/sub clients for various message brokers such as Google Pub/Sub, Kafka and MQTT.
google
Package google provides a client for interacting with Google Cloud Pub/Sub. This package facilitates interaction with Google Cloud Pub/Sub, allowing publishing and subscribing to topics, managing subscriptions, and handling messages.
kafka
Package kafka provides a client for interacting with Apache Kafka message queues. This package facilitates interaction with Apache Kafka, allowing publishing and subscribing to topics, managing consumer groups, and handling messages.
mqtt
Package mqtt provides a client for interacting with MQTT message brokers. This package facilitates interaction with MQTT brokers, allowing publishing and subscribing to topics, managing subscriptions, and handling messages.
eventhub module
nats module
redis
Package redis provides a client for interacting with Redis key-value stores. This package allows creating and managing Redis clients, executing Redis commands, and handling connections to Redis databases.
solr module
sql
Package sql provides functionalities to interact with SQL databases using the database/sql package. This package includes a wrapper around sql.DB and sql.Tx to provide additional features such as query logging, metrics recording, and error handling.
