
Python SIEM Query Utils nbdev edition

Project description

SIEM Query Utils

Install


Below is how to install into a plain Python 3.11+ environment:

pip install nbdev-squ

The installation can also be run in a notebook (we tend to use JupyterLab Desktop for local dev). The SQU_CONFIG env var tells nbdev_squ to load the JSON secret squconfig-my_keyvault_tenantid from the my_keyvault_name keyvault.

%pip install nbdev-squ
import os; os.environ["SQU_CONFIG"] = "{{ my_keyvault_name }}/{{ my_keyvault_tenantid }}" 

from nbdev_squ import api
# do cool notebook stuff with api

Security considerations

The contents of the keyvault secret are loaded into memory and cached in the user_cache_dir which should be a temporary secure directory restricted to the single user. Please ensure that the system this library is used on disallows access and/or logging of the user cache directory to external locations, and is on an encrypted disk (a common approach is to use isolated VMs and workstations for sensitive activities).
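The restriction described above can be sketched as a quick permission check; the helper name and demo directory below are illustrative, not part of nbdev_squ:

```python
# Sketch: confirm a per-user cache directory is restricted to the current
# user (mode 0o700) before trusting it with cached secrets (POSIX only).
import os
import stat
import tempfile

def ensure_private_dir(path: str) -> str:
    """Create `path` if needed and restrict it to the owning user."""
    os.makedirs(path, mode=0o700, exist_ok=True)
    os.chmod(path, 0o700)  # tighten perms even if the dir already existed
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        raise PermissionError(f"{path} is readable by group/other: {oct(mode)}")
    return path

cache_dir = ensure_private_dir(os.path.join(tempfile.gettempdir(), "squ-cache-demo"))
```

This only checks filesystem modes; the encrypted-disk and isolated-workstation guidance above still applies.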

How to use

Note: If you create/use a GitHub Codespace on any of the wagov repos, SQU_CONFIG should be configured automatically.

Before use, config needs to be loaded into nbdev_squ.core.cache; it can be loaded automatically from JSON in a keyvault by setting the env var SQU_CONFIG to "keyvault/tenantid".

export SQU_CONFIG="{{ keyvault }}/{{ tenantid }}"
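For reference, the "keyvault/tenantid" value splits into the vault name and tenant id, which together identify the squconfig-<tenantid> secret; a minimal sketch of that parsing (the variable names below are illustrative, not nbdev_squ internals):

```python
# Sketch: derive the keyvault name and secret name from SQU_CONFIG.
import os

os.environ["SQU_CONFIG"] = "my_keyvault_name/my_tenant_id"  # placeholder values

vault, tenant_id = os.environ["SQU_CONFIG"].split("/", 1)
secret_name = f"squconfig-{tenant_id}"
print(vault, secret_name)  # my_keyvault_name squconfig-my_tenant_id
```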

This can also be done in Python before importing from nbdev_squ:

import os; os.environ["SQU_CONFIG"] = "{{ keyvault }}/{{ tenantid }}"
from nbdev_squ import api
import io, pandas

# Load workspace info from datalake blob storage
df = api.list_workspaces(fmt="df"); print(df.shape)

# Load workspace info from introspection of azure graph
df = api.list_securityinsights(); print(df.shape)

# Kusto query to Sentinel workspaces via Azure Lighthouse
df = api.query_all("SecurityIncident | take 20", fmt="df"); print(df.shape)

# Kusto queries to Sentinel workspaces via Azure Lighthouse (batches up to 100 queries at a time)
df = api.query_all(["SecurityAlert | take 20" for _ in range(10)]); print(df.shape)
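query_all batches submitted queries (up to 100 per batch, per the comment above); that chunking pattern can be sketched generically (this is an illustration, not nbdev_squ's internal implementation):

```python
# Sketch: split a list of queries into batches of at most 100,
# as query_all is described as doing before dispatching them.
def batched(items, size=100):
    """Yield successive chunks of `items` with at most `size` elements each."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

queries = [f"SecurityAlert | take 20 // demo {i}" for i in range(250)]
batches = list(batched(queries))
print([len(b) for b in batches])  # [100, 100, 50]
```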

# Kusto query to ADX
#df = api.adxtable2df(api.adx_query("kusto query | take 20"))

# General azure cli cmd
api.azcli(["config", "set", "extension.use_dynamic_install=yes_without_prompt"])
print(len(api.azcli(["account", "list"])))

# Various pre-configured api clients

# RunZero
response = api.clients.runzero.get("/export/org/assets.csv", params={"search": "has_public:t AND alive:t AND (protocol:rdp OR protocol:vnc OR protocol:teamviewer OR protocol:telnet OR protocol:ftp)"})
pandas.read_csv(io.StringIO(response.text)).head(10)

# Jira
pandas.json_normalize(api.clients.jira.jql("updated > -1d")["issues"]).head(10)

# AbuseIPDB
api.clients.abuseipdb.check_ip("1.1.1.1")

# TenableIO
pandas.DataFrame(api.clients.tio.scans.list()).head(10)

# Distinct entity addresses from true-positive incidents over the past 45 days
badips_df = api.query_all("""
SecurityIncident
| where Classification == "TruePositive"
| mv-expand AlertIds
| project tostring(AlertIds)
| join SecurityAlert on $left.AlertIds == $right.SystemAlertId
| mv-expand todynamic(Entities)
| project Entities.Address
| where isnotempty(Entities_Address)
| distinct tostring(Entities_Address)
""", timespan=pandas.Timedelta("45d"))

# Search all tables for a client IP prefix and unpack the packed columns
df = api.query_all("find where ClientIP startswith '172.16.' | evaluate bag_unpack(pack_) | take 40000")

# Sample one row per table per hour of ingestion, packed into a single column
df = api.query_all("""union withsource="_table" *
| extend _ingestion_time_bin = bin(ingestion_time(), 1h)
| summarize take_any(*) by _table, _ingestion_time_bin
| project pack=pack_all(true)""")
import json
pandas.DataFrame(list(df["pack"].apply(json.loads)))  # unpack the JSON column into a dataframe

Secrets template

The JSON below can be used as a template for saving your own configuration into the my_keyvault_name keyvault as the squconfig-my_keyvault_tenantid secret for use with this library:

{
  "config_version": "20240101 - added ??? access details",
  "datalake_blob_prefix": "https://???/???",
  "datalake_subscription": "???",
  "datalake_account": "???.blob.core.windows.net",
  "datalake_container": "???",
  "kql_baseurl": "https://raw.githubusercontent.com/???",
  "azure_dataexplorer": "https://???.???.kusto.windows.net/???",
  "tenant_id": "???",
  "jira_url": "https://???.atlassian.net",
  "jira_username": "???@???",
  "jira_password": "???",
  "runzero_apitoken": "???",
  "abuseipdb_api_key": "???",
  "tenable_access_key": "???",
  "tenable_secret_key": "???"
}
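Once filled in, the template can be checked locally before upload; a hedged sketch (the required-key list and file names are illustrative, and the az command in the comment assumes an authenticated Azure CLI):

```python
# Sketch: validate that a filled-in config template is valid JSON and
# contains a few expected keys before uploading it as a keyvault secret.
import json

REQUIRED_KEYS = {"tenant_id", "datalake_account", "datalake_container"}

config = json.loads("""
{
  "config_version": "20240101 - initial",
  "tenant_id": "00000000-0000-0000-0000-000000000000",
  "datalake_account": "example.blob.core.windows.net",
  "datalake_container": "example"
}
""")

missing = REQUIRED_KEYS - config.keys()
assert not missing, f"missing keys: {missing}"

# Upload (requires `az login` first):
#   az keyvault secret set --vault-name my_keyvault_name \
#     --name squconfig-my_keyvault_tenantid --file squconfig.json
```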

Download files

Download the file for your platform.

Source Distribution

nbdev_squ-1.3.6.tar.gz (511.0 kB)

Uploaded Source

Built Distribution

nbdev_squ-1.3.6-py3-none-any.whl (267.2 kB)

Uploaded Python 3

File details

Details for the file nbdev_squ-1.3.6.tar.gz.

File metadata

  • Download URL: nbdev_squ-1.3.6.tar.gz
  • Size: 511.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for nbdev_squ-1.3.6.tar.gz:

  • SHA256: c56dfef4a97c301e3c73368589cf38348b198ab454e7da1da36e626f13fe8711
  • MD5: 40b8f79f64d366c5977ad47fa672caf1
  • BLAKE2b-256: 1595df958036a137a995faa5f3b1f7626dcb4229fbfd011c559f4a4ead991d2a


File details

Details for the file nbdev_squ-1.3.6-py3-none-any.whl.

File metadata

  • Download URL: nbdev_squ-1.3.6-py3-none-any.whl
  • Size: 267.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for nbdev_squ-1.3.6-py3-none-any.whl:

  • SHA256: 24afe12befd428c2e05f4669935f17089c7f3f9f20af4b9e0e7656bb1511b8c8
  • MD5: c7c118198f3a558f12ebb60ecc60357f
  • BLAKE2b-256: 70f3ccb7b3ea4cec28ff7a2a91450dcce3bc705732f6f24470ea5e2d1843c759

