
Python SIEM Query Utils nbdev edition

Project description

SIEM Query Utils

Install

Install into a plain Python 3.11+ environment with pip:

pip install https://github.com/wagov/nbdev-squ/archive/refs/tags/v1.3.2.tar.gz

The installation can also be run in a notebook (we tend to use JupyterLab Desktop for local development). The SQU_CONFIG env var tells nbdev_squ to load the JSON secret squconfig-my_keyvault_tenantid from the my_keyvault_name keyvault.

%pip install https://github.com/wagov/nbdev-squ/archive/refs/tags/v1.3.2.tar.gz
import os; os.environ["SQU_CONFIG"] = "{{ my_keyvault_name }}/{{ my_keyvault_tenantid }}" 

from nbdev_squ import api
# do cool notebook stuff with api

Security considerations

The contents of the keyvault secret are loaded into memory and cached in the user_cache_dir which should be a temporary secure directory restricted to the single user. Please ensure that the system this library is used on disallows access and/or logging of the user cache directory to external locations, and is on an encrypted disk (a common approach is to use isolated VMs and workstations for sensitive activities).
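As a quick sanity check, you can confirm a cache directory is restricted to the current user (mode 0700 on POSIX) before trusting it with secrets. The snippet below is a sketch using a throwaway directory, not the exact path this library uses:

```python
import os
import stat
import tempfile

# Create a throwaway per-user cache directory restricted to the current user
# (mode 0700), mirroring what a secure user cache directory should look like.
cache_dir = tempfile.mkdtemp(prefix="squ-cache-")
os.chmod(cache_dir, 0o700)

# Verify no group/other permission bits are set.
mode = stat.S_IMODE(os.stat(cache_dir).st_mode)
print(oct(mode))
```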

How to use

Note: If you create/use a GitHub Codespace on any of the wagov repos, SQU_CONFIG should be configured automatically.

Before use, configuration needs to be loaded into squ.core.cache; this can be done automatically from JSON in a keyvault by setting the env var SQU_CONFIG to "keyvault/tenantid":

export SQU_CONFIG="{{ keyvault }}/{{ tenantid }}"

The env var can also be set in Python before importing from nbdev_squ:

import os; os.environ["SQU_CONFIG"] = "{{ keyvault }}/{{ tenantid }}"
from nbdev_squ import api
import io, pandas

# Load workspace info from datalake blob storage
df = api.list_workspaces(fmt="df"); print(df.shape)

# Load workspace info from introspection of azure graph
df = api.list_securityinsights(); print(df.shape)

# Kusto query to Sentinel workspaces via Azure Lighthouse
df = api.query_all("SecurityIncident | take 20", fmt="df"); print(df.shape)

# Kusto queries to Sentinel workspaces via Azure Lighthouse (batches up to 100 queries at a time)
df = api.query_all(["SecurityAlert | take 20" for _ in range(10)]); print(df.shape)
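The batching behaviour mentioned above can be pictured with a simple chunking helper; this is a sketch of the idea, not the library's internal code:

```python
def chunks(queries, size=100):
    """Yield successive batches of at most `size` queries."""
    for i in range(0, len(queries), size):
        yield queries[i:i + size]

# 250 queries are dispatched as batches of 100, 100 and 50.
queries = [f"SecurityAlert | take {n}" for n in range(250)]
print([len(batch) for batch in chunks(queries)])  # [100, 100, 50]
```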

# Kusto query to ADX
#df = api.adxtable2df(api.adx_query("kusto query | take 20"))

# General azure cli cmd
api.azcli(["config", "set", "extension.use_dynamic_install=yes_without_prompt"])
print(len(api.azcli(["account", "list"])))
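Since api.azcli returns parsed output (the len() call above works on a list), results can be handled as plain Python objects. The sample below parses a hand-written JSON string shaped like abbreviated `az account list` output; the subscription names are made up for illustration:

```python
import json

# Abbreviated JSON of the shape `az account list` emits.
sample = '''[
  {"name": "prod-subscription", "isDefault": true},
  {"name": "dev-subscription", "isDefault": false}
]'''

accounts = json.loads(sample)
default = [a["name"] for a in accounts if a["isDefault"]]
print(len(accounts), default)  # 2 ['prod-subscription']
```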

# Various pre-configured api clients

# RunZero
response = api.clients.runzero.get("/export/org/assets.csv", params={"search": "has_public:t AND alive:t AND (protocol:rdp OR protocol:vnc OR protocol:teamviewer OR protocol:telnet OR protocol:ftp)"})
pandas.read_csv(io.StringIO(response.text)).head(10)
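The same kind of CSV export can also be inspected with just the standard library; the columns and values below are illustrative stand-ins, not the real RunZero schema:

```python
import csv
import io

# Stand-in CSV shaped like an asset export.
sample = "address,protocol,alive\n203.0.113.5,rdp,t\n198.51.100.9,telnet,t\n"
rows = list(csv.DictReader(io.StringIO(sample)))

# Filter to live assets, as the has_public/alive search terms above would.
exposed = [r["address"] for r in rows if r["alive"] == "t"]
print(exposed)  # ['203.0.113.5', '198.51.100.9']
```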

# Jira
pandas.json_normalize(api.clients.jira.jql("updated > -1d")["issues"]).head(10)

# AbuseIPDB
api.clients.abuseipdb.check_ip("1.1.1.1")

# TenableIO
pandas.DataFrame(api.clients.tio.scans.list()).head(10)

# Collect distinct entity addresses from true-positive incidents over the last 45 days
badips_df = api.query_all("""
SecurityIncident
| where Classification == "TruePositive"
| mv-expand AlertIds
| project tostring(AlertIds)
| join SecurityAlert on $left.AlertIds == $right.SystemAlertId
| mv-expand todynamic(Entities)
| project Entities.Address
| where isnotempty(Entities_Address)
| distinct tostring(Entities_Address)
""", timespan=pandas.Timedelta("45d"))

# Hunt across all tables for client IPs in an internal range, unpacking the packed columns
df = api.query_all("find where ClientIP startswith '172.16.' | evaluate bag_unpack(pack_) | take 40000")

# Sample one row per table per hour of ingestion, packed into a single JSON column
df = api.query_all("""union withsource="_table" *
| extend _ingestion_time_bin = bin(ingestion_time(), 1h)
| summarize take_any(*) by _table, _ingestion_time_bin
| project pack=pack_all(true)""")

# Expand the packed JSON column back into a DataFrame
import json
pandas.DataFrame(list(df["pack"].apply(json.loads)))

Secrets template

The JSON below can be used as a template: fill in your own values and save it as the squconfig-my_keyvault_tenantid secret in the my_keyvault_name keyvault to use with this library:

{
  "config_version": "20240101 - added ??? access details",
  "datalake_blob_prefix": "https://???/???",
  "datalake_subscription": "???",
  "datalake_account": "???.blob.core.windows.net",
  "datalake_container": "???",
  "kql_baseurl": "https://raw.githubusercontent.com/???",
  "azure_dataexplorer": "https://???.???.kusto.windows.net/???",
  "tenant_id": "???",
  "jira_url": "https://???.atlassian.net",
  "jira_username": "???@???",
  "jira_password": "???",
  "runzero_apitoken": "???",
  "abuseipdb_api_key": "???",
  "tenable_access_key": "???",
  "tenable_secret_key": "???"
}
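Note the secret must be strict JSON (no trailing commas). A quick way to validate a filled-in template before uploading it; the required key set below is inferred from the template above, not an authoritative list:

```python
import json

# A filled-in (abbreviated) template with placeholder values.
secret = '''{
  "config_version": "20240101 - initial",
  "tenant_id": "00000000-0000-0000-0000-000000000000",
  "datalake_account": "example.blob.core.windows.net",
  "datalake_container": "siem"
}'''

config = json.loads(secret)  # raises ValueError if the JSON is malformed
missing = {"tenant_id", "datalake_account", "datalake_container"} - config.keys()
print(sorted(missing))  # []
```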
