performance

package v0.9.0
Published: Sep 17, 2019 License: Apache-2.0 Imports: 7 Imported by: 0

README

Knative Eventing Performance Tests

Getting Started

  1. Create a namespace or use an existing namespace.

  2. Create a ConfigMap called config-mako in your chosen namespace.

cat <<EOF | kubectl apply -n <namespace> -f -
apiVersion: v1
kind: ConfigMap
metadata:
  name: config-mako
data:
  environment: dev
EOF

NewConfigFromMap determines the valid keys in this ConfigMap. The current keys are listed below, followed by a short parsing sketch:

  • environment: Selects a Mako config file in kodata. E.g. environment: dev corresponds to kodata/dev.config.
  • additionalTags: Comma-separated list of tags to apply to the Mako run.
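
A minimal sketch, assuming a hypothetical makoConfig struct and configFromMap helper (not this package's actual NewConfigFromMap), of how those two keys could be read out of the ConfigMap data:

package main

import (
	"fmt"
	"strings"
)

// makoConfig mirrors the two documented keys; field names are illustrative assumptions.
type makoConfig struct {
	Environment    string   // selects kodata/<environment>.config
	AdditionalTags []string // extra tags applied to the Mako run
}

// configFromMap is a hypothetical stand-in for NewConfigFromMap.
func configFromMap(data map[string]string) makoConfig {
	cfg := makoConfig{Environment: "dev"} // assumed default
	if env, ok := data["environment"]; ok && env != "" {
		cfg.Environment = env
	}
	if tags, ok := data["additionalTags"]; ok && tags != "" {
		cfg.AdditionalTags = strings.Split(tags, ",")
	}
	return cfg
}

func main() {
	cfg := configFromMap(map[string]string{
		"environment":    "dev",
		"additionalTags": "nightly,eventing",
	})
	fmt.Printf("config file: kodata/%s.config, tags: %v\n", cfg.Environment, cfg.AdditionalTags)
}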

Running a benchmark

  1. Use ko to apply the YAML files in the benchmark directory.
ko apply -f test/performance/broker-latency

Documentation

Index

Constants

const (
	TestResultKey  = "RESULT"
	TestPass       = "PASS"
	TestFail       = "FAIL"
	TestFailReason = "FAIL_REASON"
)

Constants used in the pod logs, which we then use to build test results. TODO(Fredy-Z): we'll need a more robust way to export test results from the pod.
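
The exact layout these constants take in the pod log is not documented here; the following is a hedged sketch, assuming a simple "KEY: value" layout, of how a benchmark pod could emit them so the harness can recover a result:

package main

import "fmt"

// These mirror the package constants listed above.
const (
	testResultKey  = "RESULT"
	testPass       = "PASS"
	testFail       = "FAIL"
	testFailReason = "FAIL_REASON"
)

func main() {
	// Assumed "KEY: value" layout; ParseTestResultFromLog's real expectations may differ.
	fmt.Printf("%s: %s\n", testResultKey, testFail)
	fmt.Printf("%s: %s\n", testFailReason, "p99 latency exceeded the configured threshold")
}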

Variables

This section is empty.

Functions

func CreatePerfTestCase

func CreatePerfTestCase(metricValue float32, metricName, testName string) junit.TestCase

CreatePerfTestCase creates a perf test case with the provided metric value, metric name, and test name.
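
A hedged usage sketch, assuming the import path knative.dev/eventing/test/performance and that the returned junit.TestCase is collected into a test suite elsewhere; the metric name and value are made up:

package main

import (
	"fmt"

	"knative.dev/eventing/test/performance" // assumed import path
)

func main() {
	// Record one latency measurement as a JUnit test case.
	tc := performance.CreatePerfTestCase(42.7, "p99-latency-ms", "broker-latency")
	fmt.Printf("recorded test case: %+v\n", tc)
}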

func ParseTestResultFromLog

func ParseTestResultFromLog(client *test.KubeClient, podName, containerName, namespace string) (map[string]string, error)

ParseTestResultFromLog will parse the test result from the pod log. TODO(Fredy-Z): this is very hacky and error-prone; we need to find a better way to get the result, probably by writing the logs in JSON format, as zipkin tracing does.
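
A hedged usage sketch, assuming the import paths knative.dev/eventing/test/performance and knative.dev/pkg/test; building the KubeClient from a kubeconfig is omitted, and the pod, container, and namespace names are placeholders:

package perfexample

import (
	"fmt"
	"log"

	"knative.dev/eventing/test/performance" // assumed import path
	"knative.dev/pkg/test"                  // assumed home of KubeClient
)

func checkResult(client *test.KubeClient) {
	// Pull the key/value pairs out of the benchmark pod's log.
	results, err := performance.ParseTestResultFromLog(client, "latency-pod", "sender", "perf-tests")
	if err != nil {
		log.Fatalf("failed to parse test result: %v", err)
	}
	if results[performance.TestResultKey] == performance.TestFail {
		fmt.Println("benchmark failed:", results[performance.TestFailReason])
	}
}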

Types

This section is empty.
