Python SIEM Query Utils nbdev edition

SIEM Query Utils

Install

Below is how to install into a plain Python 3.12+ environment:

pip install nbdev-squ

The installation can also be run in a notebook (we tend to use JupyterLab Desktop for local development). The SQU_CONFIG env var tells nbdev_squ to load the JSON secret squconfig-my_keyvault_tenantid from the my_keyvault_name keyvault.

%pip install nbdev-squ
import os; os.environ["SQU_CONFIG"] = "{{ my_keyvault_name }}/{{ my_keyvault_tenantid }}" 

from nbdev_squ import api
# do cool notebook stuff with api

Security considerations

The contents of the keyvault secret are loaded into memory and cached in the user_cache_dir, which should be a temporary, secure directory restricted to the single user. Please ensure that the system this library runs on disallows access to (and/or logging of) the user cache directory to external locations, and that it sits on an encrypted disk (a common approach is to use isolated VMs and workstations for sensitive activities).
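As an illustration only (not part of the library), the sketch below checks that a directory is accessible solely by its owning user on POSIX systems; the same idea applies to the user cache directory mentioned above:

```python
import os
import stat
import tempfile
from pathlib import Path

def is_private_dir(path: Path) -> bool:
    """True if no group/other permission bits are set (POSIX)."""
    mode = stat.S_IMODE(path.stat().st_mode)
    return mode & 0o077 == 0

# demo on a fresh temp dir restricted to the current user
d = Path(tempfile.mkdtemp())
os.chmod(d, 0o700)
print(is_private_dir(d))  # True on POSIX systems
```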

How to use

Note: if you create/use a GitHub Codespace on any of the wagov repos, SQU_CONFIG should be configured automatically.

Before use, configuration needs to be loaded into nbdev_squ.core.cache; this happens automatically from JSON in a keyvault when the env var SQU_CONFIG is set to "keyvault/tenantid".

export SQU_CONFIG="{{ keyvault }}/{{ tenantid }}"

This can also be done in Python before importing from nbdev_squ:

import os; os.environ["SQU_CONFIG"] = "{{ keyvault }}/{{ tenantid }}"
from nbdev_squ import api
import io, pandas

# Load workspace info from datalake blob storage
df = api.list_workspaces(fmt="df"); print(df.shape)

# Load workspace info from introspection of azure graph
df = api.list_securityinsights(); print(df.shape)

# Kusto query to Sentinel workspaces via Azure Lighthouse
df = api.query_all("SecurityIncident | take 20", fmt="df"); print(df.shape)

# Kusto queries to Sentinel workspaces via Azure Lighthouse (batches up to 100 queries at a time)
df = api.query_all(["SecurityAlert | take 20" for _ in range(10)]); print(df.shape)

# Kusto query to ADX
#df = api.adxtable2df(api.adx_query("kusto query | take 20"))

# General azure cli cmd
api.azcli(["config", "set", "extension.use_dynamic_install=yes_without_prompt"])
print(len(api.azcli(["account", "list"])))

# Various pre-configured api clients

# RunZero
response = api.clients.runzero.get("/export/org/assets.csv", params={"search": "has_public:t AND alive:t AND (protocol:rdp OR protocol:vnc OR protocol:teamviewer OR protocol:telnet OR protocol:ftp)"})
pandas.read_csv(io.StringIO(response.text)).head(10)

# Jira
pandas.json_normalize(api.clients.jira.jql("updated > -1d")["issues"]).head(10)

# AbuseIPDB
api.clients.abuseipdb.check_ip("1.1.1.1")

# TenableIO
pandas.DataFrame(api.clients.tio.scans.list()).head(10)
badips_df = api.query_all("""
SecurityIncident
| where Classification == "TruePositive"
| mv-expand AlertIds
| project tostring(AlertIds)
| join SecurityAlert on $left.AlertIds == $right.SystemAlertId
| mv-expand todynamic(Entities)
| project Entities.Address
| where isnotempty(Entities_Address)
| distinct tostring(Entities_Address)
""", timespan=pandas.Timedelta("45d"))
df = api.query_all("find where ClientIP startswith '172.16.' | evaluate bag_unpack(pack_) | take 40000")
df = api.query_all("""union withsource="_table" *
| extend _ingestion_time_bin = bin(ingestion_time(), 1h)
| summarize take_any(*) by _table, _ingestion_time_bin
| project pack=pack_all(true)""")
import json
pandas.DataFrame(list(df["pack"].apply(json.loads)))
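query_all handles batching internally; purely for illustration, the client-side chunking it performs (up to 100 queries per batch) can be sketched as:

```python
def chunked(items: list, size: int = 100):
    """Yield successive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

queries = [f"SecurityAlert | take {n}" for n in range(1, 251)]
print([len(batch) for batch in chunked(queries)])  # [100, 100, 50]
```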

Secrets template

The JSON below can be used as a template for saving your own configuration into my_keyvault_name/squconfig-my_keyvault_tenantid for use with this library:

{
  "config_version": "20240101 - added ??? access details",
  "datalake_blob_prefix": "https://???/???",
  "datalake_subscription": "???",
  "datalake_account": "???.blob.core.windows.net",
  "datalake_container": "???",
  "kql_baseurl": "https://raw.githubusercontent.com/???",
  "azure_dataexplorer": "https://???.???.kusto.windows.net/???",
  "tenant_id": "???",
  "jira_url": "https://???.atlassian.net",
  "jira_username": "???@???",
  "jira_password": "???",
  "runzero_apitoken": "???",
  "abuseipdb_api_key": "???",
  "tenable_access_key": "???",
  "tenable_secret_key": "???"
}
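Once the template is filled in, it can be uploaded with the Azure CLI (a sketch; assumes an authenticated az session with write access to the keyvault, and that the completed template is saved locally as squconfig.json):

```shell
az keyvault secret set \
  --vault-name my_keyvault_name \
  --name "squconfig-my_keyvault_tenantid" \
  --file squconfig.json
```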

