Published: Nov 29, 2022 · License: MIT


Lookup integration test

Created using Ubuntu under WSL. Other Linux distributions and macOS may require edits.

Workflow

The DOT diagram was generated with

go run toolbelt.go validate_script -script_file=../../../test/data/cfg/lookup/script.json -params_file=../../../test/data/cfg/lookup/script_params_two_runs.json -idx_dag=true

and rendered at https://dreampuf.github.io/GraphvizOnline :

(Workflow diagram: DAG of the lookup script nodes.)

What's tested:

  • table_lookup_table with parallelism (10 batches) and all supported types of joins (inner and left outer, grouped and not)
  • file_table read from a single file
  • table_file with top/limit/order
  • single-run (test_one_run.sh) and multi-run (test_two_runs.sh) script execution

The multi-run test simulates a scenario in which an operator validates the loaded order and order item data before proceeding to join orders with order items.

How to test

Direct node execution

Run test_exec_nodes.sh - the Toolbelt executes script nodes one by one, without invoking the RabbitMQ workflow.

Using RabbitMQ workflow (single run)

Make sure the Daemon is running:

  • either run go run daemon.go from pkg/exe/daemon
  • or start the Daemon container (docker compose -p "test_capillaries_containers" start daemon)

Run test_one_run.sh - the Toolbelt publishes batch messages to RabbitMQ and the Daemon consumes them and executes all script nodes in parallel as part of a single run.

Using RabbitMQ workflow (two runs)

Make sure the Daemon is running:

  • either run go run daemon.go from pkg/exe/daemon
  • or start the Daemon container (docker compose -p "test_capillaries_containers" start daemon)

Run test_two_runs.sh - the Toolbelt publishes batch messages to RabbitMQ and the Daemon consumes them and executes script nodes that load data from files as part of the first run.

After the first run is complete, the Toolbelt publishes batch messages to RabbitMQ and the Daemon consumes them and executes script nodes that process the data as part of the second run.

This test mimics the "operator validation" scenario.

Webapi

Make sure the Daemon is running:

  • either run go run daemon.go from pkg/exe/daemon
  • or start the Daemon container (docker compose -p "test_capillaries_containers" start daemon)

Make sure the Webapi is running:

  • either run go run webapi.go from pkg/exe/webapi
  • or start the Webapi container (docker compose -p "test_capillaries_containers" start webapi)

The test runs the same scenario as the two-runs test above, but uses the Webapi instead of the Toolbelt.

Possible edits

Play with the total number of line items (see "-items=..." in 1_create_test_data.sh).

References:

Data model design: Brazilian E-Commerce public dataset (https://www.kaggle.com/datasets/olistbr/brazilian-ecommerce)
