
SPL2 Testing Framework


Overview

The SPL2 Testing Framework enables running SPL2 tests locally (on any Splunk instance with the SPL2 orchestrator), remotely (using external cloud environments), or via the CLI.

  • For cloud: it uses the Search Service API.
  • For Splunk: it uses the Splunk Search API (with SPL2 support).
  • For CLI: it uses the internal spl2-processor-cli library. This option is not available for public use, as spl2-processor-cli is a Splunk internal tool.

Prerequisites

1. Install Python and Poetry

  1. Ensure python3.x is available
  2. Install testing framework:
    • Install poetry and execute the command poetry install to create a virtual environment and install required dependencies.
    • OR
    • Install the library using pip: pip install spl2-testing-framework

2. Set the configuration. This can be done in a spl2_test_config.json file or via environment variables

Note: setting the configuration is necessary for running tests on a Splunk or cloud environment. For running tests via the CLI, no configuration is required; the only requirement is to have the spl2-processor-cli tool installed, as described below

spl2_test_config.json - a local file in the current working directory (the directory from which tests are executed)

Configuration for running tests using cloud search client (Ingest processor)
  • cloud_instance - address of Cloud host where the tests can be executed
    • e.g.: staging.scs.splunk.com
  • tenant - tenant to use for testing
    • e.g.: spl2-content
  • bearer_token - token used for authentication. To obtain the token, go to: https://console.[cloud_instance]/[tenant]/settings
Configuration for running tests using splunk search client (Splunk instance)
  • host - address of Splunk host where the tests can be executed
    • e.g.: localhost or https://10.202.35.219
  • port - port of Splunk host where the tests can be executed
    • usually 8089, but can be different
  • user - user to authenticate
  • password - password to authenticate
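
A combined spl2_test_config.json covering both clients might look like this (all values are placeholders, and the value types shown are illustrative):

```json
{
  "cloud_instance": "staging.scs.splunk.com",
  "tenant": "spl2-content",
  "bearer_token": "<your-token>",
  "host": "localhost",
  "port": 8089,
  "user": "admin",
  "password": "<your-password>"
}
```

Only the keys for the client you actually use (cloud or Splunk) need to be present.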

The same configuration can be provided via environment variables (however, spl2_test_config.json takes higher priority):

cloud search client
  • SPL2_TF_CLOUD_INSTANCE => cloud_instance
  • SPL2_TF_TENANT => tenant
  • SPL2_TF_BEARER_TOKEN => bearer_token
splunk search client
  • SPL2_TF_HOST => host
  • SPL2_TF_PORT => port
  • SPL2_TF_USER => user
  • SPL2_TF_PASSWORD => password
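
The documented precedence (config file first, environment variables as fallback) can be sketched as follows. This is an illustrative helper, not the framework's actual code; the function name resolve_setting is made up:

```python
import json
import os

# Mapping from config keys to their documented environment variables.
ENV_MAP = {
    "host": "SPL2_TF_HOST",
    "port": "SPL2_TF_PORT",
    "user": "SPL2_TF_USER",
    "password": "SPL2_TF_PASSWORD",
}

def resolve_setting(key, config_path="spl2_test_config.json"):
    """Return the value for `key`, preferring the config file over env vars."""
    try:
        with open(config_path) as f:
            file_config = json.load(f)
    except FileNotFoundError:
        file_config = {}
    if key in file_config:
        return file_config[key]  # spl2_test_config.json has higher priority
    return os.environ.get(ENV_MAP[key])

os.environ["SPL2_TF_HOST"] = "localhost"
print(resolve_setting("host", config_path="missing.json"))  # falls back to env
```

With no config file present, the environment variable wins; a key present in the file shadows its environment counterpart.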

3. Installing spl2-processor-cli (Splunk internal tool)

spl2-processor-cli can be installed using brew:

brew install spl2-processor-cli

Before installation, it may be necessary to authenticate to artifactory by running:

okta-artifactory-login -t generic

Running tests

To run tests, execute the following command in the directory where the tests are located:

spl2_tests_run [cli|splunk|cloud]

Test discovery is recursive, so it's possible to run tests even from the root directory of the project.

Additional options supported by pytest can also be passed to the command, e.g.:

  • -k "filter" - to run only tests whose names contain "filter"
  • -v[vv] - to see more verbose output
  • -n [auto|<number>] - to run tests in parallel
    • auto - use all available cores
    • <number> - use a specific number of cores
    • parallel runs are recommended mostly for the CLI, as running on Splunk or cloud doesn't give a significant performance improvement
  • -x - to stop on the first failure
  • --pdb - to enter the debugger on failure
  • ... and much more, whatever is supported by pytest

Additionally, the following options are supported:

  • --test_dir - directory where test files (.test.json, .test.spl2, module.json) are located. Defaults to the current working directory. Discovery is recursive.

  • --code_dir - directory where SPL2 code modules (.spl2) are located. Defaults to the current working directory. Falls back to --test_dir if the module is not found. See Separate code and test directories for details.

  • --ignore_empty_strings - to ignore empty strings in the results

  • --ignore_additional_fields_in_actual - to ignore fields present in actual results but not in expected results (useful when actual results contain extra fields that should not affect comparison)

  • --create_comparison_sheet - to create a comparison sheet in comparison_box_test folder using actual and expected outputs (works only when running box tests)

  • --cli_bench - when running tests with CLI, this option enables benchmarking mode by adding -b and -n flags to the spl2-processor-cli command. The value specifies the number of events to test (e.g., --cli_bench=1000 will add -b -n 1000 to the CLI command). In bench mode, all tests always succeed and CLI output is printed to logs - this is useful for performance testing and benchmarking. This is only applicable when using --type cli.

  • --check_splunk_results - option to validate box test output by sending events to a Splunk instance (using HEC) and querying them back, with support for CIM (Common Information Model) and TA (Technology Add-on) checks.

  • Note: The pytest.ini.sample file allows you to define command parameters. Just update the configurations, rename the file by removing the .sample extension, and execute the command.

Run tests in IDE [PyCharm]

It's also possible to run tests in PyCharm. To do this, set up a Run Configuration.

Sample configuration which may be used:

  • Run configuration
    • Type: Python test
    • Module: spl2_testing_framework.test_runner
    • Parameters: --type [cli | splunk | cloud] --test_dir /tests/resources -o log_cli=true --log-cli-level=INFO --verbose
    • If --test_dir is not specified, the current working directory will be used
    • Use --code_dir if SPL2 code modules are in a separate directory from the tests
    • Other pytest options can be added if necessary

Note: It's necessary to set "pytest" as default test runner in PyCharm settings

Separate code and test directories

By default, the framework expects SPL2 code modules (.spl2 files) to live alongside their test files. The --code_dir option allows you to keep code and tests in separate directory trees.

How it works

  • Code modules (.spl2): primary directory --code_dir, fallback directory --test_dir
  • Test files (.test.json, .test.spl2): primary directory --test_dir, fallback directory --code_dir

Both searches are recursive. If the primary directory has no results, the framework falls back to the other directory and logs the action.

When multiple files with the same name are found within a directory, a warning is logged and the lexicographically first match is used.
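
The lookup described above can be sketched like this. It is an illustrative helper, not the framework's actual code; find_module is a made-up name, and a real run would also log the fallback and the duplicate-name warning:

```python
from pathlib import Path

def find_module(name, primary, fallback):
    """Recursively search `primary` for `name`; fall back to `fallback`.

    When several files share the name within a directory tree, the
    lexicographically first path is used, mirroring the documented behavior.
    """
    for directory in (primary, fallback):
        matches = sorted(Path(directory).rglob(name))
        if matches:
            return matches[0]
    return None
```

With --code_dir src and --test_dir tests, a module missing from src would still be found under tests by the second pass of the loop.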

Example project layout

my-project/
├── src/                          # --code_dir
│   ├── network_traffic.spl2
│   └── dns_lookup.spl2
└── tests/                        # --test_dir
    ├── network_traffic.test.json
    ├── dns_lookup.test.json
    └── dns_lookup.test.spl2

Run with:

spl2_tests_run cli --test_dir tests --code_dir src

Backward compatibility

When --code_dir is not specified (or both flags point to the same directory), the framework behaves exactly as before — code modules are found next to their test files via the same recursive search.

Applies to all test types

The --code_dir option works for box tests, unit tests, and single SPL2 file execution.

Executing an SPL2 file

This framework also supports executing a single SPL2 file, printing the results to the command line as well as to a log file. This helps developers see the results of an SPL2 pipeline while they are developing it.

It accepts three additional parameters:

  • --template_file
  • --sample_file
  • --sample_delimiter

It will execute the template_file provided in the --template_file parameter. It will read samples if --sample_file parameter is provided and will separate the samples by using --sample_delimiter. If --sample_file is not provided, then it will look for the samples in the respective module.json file corresponding to the template_file.
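
The sample-splitting behavior described above can be sketched as follows; load_samples is an illustrative name, not a framework function:

```python
def load_samples(sample_text, delimiter=None):
    """Split raw sample file content into individual sample events.

    Mirrors the documented behavior: samples are separated by the
    --sample_delimiter value; when it is not given, newline is the default.
    """
    sep = delimiter if delimiter is not None else "\n"
    # Drop empty fragments, e.g. from a trailing delimiter.
    return [s for s in sample_text.split(sep) if s.strip()]
```

For example, a file containing two events separated by "###" would be split with --sample_delimiter "###", while a plain line-per-event file needs no delimiter at all.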

To run a single spl2 file, execute the command:

single_spl2_file_run [cli|splunk|cloud]

Additional options supported by pytest can also be passed to the command, e.g.:

  • --test_dir - Path in which the template file and module.json are available. If not provided, it will look in the current directory for the template file and module.json file

  • --code_dir - Path where SPL2 code modules are located, if different from --test_dir. Falls back to --test_dir when not set. See Separate code and test directories.

  • --template_file - The spl2 template file to execute

  • --sample_file - A file containing all the samples required for the template file. If not provided, it will look for the samples in module.json file of the corresponding template file

  • --sample_delimiter - Separator for separating the samples provided in the sample file. If not provided, it will use newline as a default separator.

  • --limit_tests - Limit the number of tests to execute from the unit tests, sample file, or module.json. If not provided, all will be executed. This can be used for quick testing of SPL2 files during development.

  • ... and much more, whatever is supported by pytest

Note: The pytest.ini.sample file allows you to define command parameters. Just update the configurations, rename the file by removing the .sample extension, and execute the command.

Performance check

It is possible to measure the execution time of an SPL2 pipeline, or even do more advanced time checks, using the flag:

  • --performance_check=time - to run basic time measurements - the execution time of the SPL2 pipeline will be printed to stdout
  • --performance_check=detailed_time - to do more advanced time checks, which inject more timestamps into the SPL2 pipeline.

Running the detailed_time check creates a text file containing the SPL2 pipeline code with timestamps injected after every command ("|"). The content of this file is also printed to stdout.

These checks can be applied only to box tests, as the assertions used in unit tests may impact SPL2 pipeline performance.
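
One way such an injection could look is sketched below. This is illustrative only: the framework's actual detailed_time rewriting is internal, and the injected field names (_perf_ts_N) are made up:

```python
def inject_timestamps(pipeline):
    """Insert a timestamp-capturing step after every pipeline command.

    Splits on the "|" command separator and appends a hypothetical
    eval that records the current time after each stage.
    """
    parts = pipeline.split("|")
    out = [parts[0]]
    for i, part in enumerate(parts[1:], start=1):
        out.append(part)
        out.append(f" eval _perf_ts_{i} = now() ")
    return "|".join(out)
```

Comparing consecutive _perf_ts_N values then shows roughly how long each stage took.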

Check splunk results

When set (--check_splunk_results=<option>), box tests (after running the SPL2 pipeline) send pipeline output to the configured Splunk instance and validate results there instead of comparing only in-memory expected vs actual.

  • CIM - For each output event, push it to Splunk, query by host, then validate that the returned event has the expected CIM fields (from expected_cim_fields.cim_fields in the box test). Uses validate_compatibility(). Skips the test if the box test has no cim_fields.
  • TA - For each output event, push it to Splunk, query by host, then assert that the returned event equals the expected output event for that index.

For TA only, you can name top-level fields to drop from both the queried event and each expected_destination_result row by adding ignore_fields_in_splunk_check: an object mapping field name to a short reason. Add it under the same test key in each entry of *.test.json (the same file as expected_destination_result):

"test": {
  "source": "...",
  "expected_destination_result": [ ... ],
  "ignore_fields_in_splunk_check": {
    "_raw": "Raw event differs after Splunk indexing",
    "_time": "Timestamp may vary due to ingestion latency"
  }
}

If the key is omitted, no fields are ignored. The setting is used only when --check_splunk_results=TA (it is ignored for other modes and for the default in-memory box test assert).
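
The effect of ignoring fields before the TA comparison can be sketched like this; drop_ignored_fields is an illustrative name, not a framework function:

```python
def drop_ignored_fields(event, ignore_fields):
    """Remove ignored top-level fields before comparing events.

    The keys of `ignore_fields` name the fields to drop; the values are
    only human-readable reasons and are not interpreted.
    """
    return {k: v for k, v in event.items() if k not in ignore_fields}

ignored = {"_raw": "Raw event differs after Splunk indexing"}
queried = {"_raw": "abc", "src_ip": "10.0.0.1"}
expected = {"_raw": "xyz", "src_ip": "10.0.0.1"}
# With _raw dropped from both sides, the events compare equal.
assert drop_ignored_fields(queried, ignored) == drop_ignored_fields(expected, ignored)
```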

Configuration:

SPLUNK_INSTANCE in spl2_test_config.json

New top-level key for the Splunk instance used by --check_splunk_results:

{
  "SPLUNK_INSTANCE": {
    "ip": "<hostname-or-ip>",
    "port": 8088,
    "api_port": 8089,
    "username": "<splunk-user>",
    "password": "<splunk-password>",
    "index": "<index-name>",
    "hec_token": "<hec-token>"
  }
}
  • ip – Splunk server host (e.g. localhost or hostname).
  • port – HEC port (e.g. 8088).
  • api_port – Management/API port (e.g. 8089) for Search API and index creation.
  • username / password – Used for Search API and index creation (HTTP Basic).
  • index – Index where HEC events are sent and then searched.
  • hec_token – Token for HEC (Authorization: Splunk <token>).
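
How these settings come together in a HEC push can be sketched by building (but not sending) the request. The endpoint path and payload shape follow Splunk's standard HEC conventions, not framework internals, and build_hec_request is a made-up helper name:

```python
import json
import urllib.request

def build_hec_request(ip, port, hec_token, index, event):
    """Construct an HTTP Event Collector POST request for one event."""
    url = f"https://{ip}:{port}/services/collector/event"
    payload = json.dumps({"event": event, "index": index}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        # HEC authenticates with "Authorization: Splunk <token>".
        headers={"Authorization": f"Splunk {hec_token}"},
        method="POST",
    )
```

The api_port, username, and password are used separately for the Search API query-back, which is not shown here.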

Do not commit real credentials. Prefer environment variables (below) or a local config that is not in version control.

Environment variables

Any SPLUNK_INSTANCE field can be set via the environment; config file values override these only when non-empty.

  • SPL2_TF_SPLUNK_INSTANCE_IP => ip
  • SPL2_TF_SPLUNK_INSTANCE_PORT => port
  • SPL2_TF_SPLUNK_INSTANCE_API_PORT => api_port
  • SPL2_TF_SPLUNK_INSTANCE_USERNAME => username
  • SPL2_TF_SPLUNK_INSTANCE_PASSWORD => password
  • SPL2_TF_SPLUNK_INSTANCE_INDEX => index
  • SPL2_TF_SPLUNK_INSTANCE_HEC_TOKEN => hec_token

Format all files

To format all files run:

black spl2_testing_framework tests
