
Simple utility for running distributed kubectl exec commands

Project description

multiexec

Simple utility for running a kubectl exec across multiple Pods.

Usage

Call the script with your Pod filters (passed through to kubectl get pods) and your command (passed through to kubectl exec -it) as follows:

$ multiexec <POD FILTERS> -- <EXEC COMMAND>

Pass --once-per-node to run the command in only a single Pod on each node. A rough sketch of what one invocation automates is shown below.
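
Conceptually, the flow is "list the Pods matched by your filters, then exec the command in each of them." The following is only an illustrative shell sketch of that idea, not the package's actual code; the namespace and the echo command are borrowed from the first example below, and the jsonpath listing is just one way the Pod list could be obtained:

# 1) list matching Pods as "namespace/name" pairs
# 2) run the exec command in each one, printing which Pod it ran in
for entry in $(kubectl get pods -n namespaceA -o jsonpath='{range .items[*]}{.metadata.namespace}/{.metadata.name}{"\n"}{end}'); do
  ns=${entry%%/*}
  pod=${entry#*/}
  echo "$ns/$pod"
  kubectl exec -n "$ns" "$pod" -- /bin/bash -c "echo hello"
done

multiexec wraps this loop (plus the per-node de-duplication behind --once-per-node) into a single command.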

Examples

Say hello in every Pod on a given node

$ multiexec --all-namespaces --field-selector spec.nodeName=ip-1-2-3-4.ec2.internal -- /bin/bash -c "echo hello"

ip-1-2-3-4.ec2.internal
$ kubectl exec -it -n namespaceA some-pod-46vp8 -- /bin/bash -c echo hello
hello

$ kubectl exec -it -n namespaceB another-pod-fcvmq -- /bin/bash -c echo hello
hello

$ kubectl exec -it -n namespaceC foo-app-l95cj -- /bin/bash -c echo hello
hello

$ kubectl exec -it -n namespaceD bar-app-6zzb8 -- /bin/bash -c echo hello
hello

Get GPU RAM usage via nvidia-smi on each node in namespaceA

$ multiexec --once-per-node -n namespaceA -- nvidia-smi --query-compute-apps=pid,used_memory --format=csv        

ip-1-2-3-4.ec2.internal
$ kubectl exec -it -n namespaceA foo-app-1 -- nvidia-smi --query-compute-apps=pid,used_memory --format=csv
pid, used_gpu_memory [MiB]
5276, 25 MiB
4860, 2437 MiB

ip-2-3-4-5.ec2.internal
$ kubectl exec -it -n namespaceA bar-app-2 -- nvidia-smi --query-compute-apps=pid,used_memory --format=csv
pid, used_gpu_memory [MiB]
12201, 25 MiB
11509, 2539 MiB
14466, 3713 MiB

ip-3-4-5-6.ec2.internal
$ kubectl exec -it -n namespaceA foo-app-2 -- nvidia-smi --query-compute-apps=pid,used_memory --format=csv
pid, used_gpu_memory [MiB]
20570, 25 MiB
19846, 2157 MiB
14641, 2149 MiB

ip-4-5-6-7.ec2.internal
$ kubectl exec -it -n namespaceA bar-app-1 -- nvidia-smi --query-compute-apps=pid,used_memory --format=csv
pid, used_gpu_memory [MiB]
23317, 25 MiB
7236, 4501 MiB
30002, 1009 MiB

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
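
If you just want the command available on your PATH, installing from PyPI in the usual way should be enough:

$ pip install multiexec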

Source Distribution

multiexec-0.0.4.tar.gz (3.6 kB)

Uploaded Source

Built Distribution

multiexec-0.0.4-py3-none-any.whl (4.1 kB)

Uploaded Python 3

File details

Details for the file multiexec-0.0.4.tar.gz.

File metadata

  • Download URL: multiexec-0.0.4.tar.gz
  • Upload date:
  • Size: 3.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.8.6

File hashes

Hashes for multiexec-0.0.4.tar.gz
  • SHA256: 8e48082a0d50faf4831dc8714f9a0e148ee0f2ff7edc5d153d716422a1ae5681
  • MD5: b8cca59f21eca2d02a6d610b58474939
  • BLAKE2b-256: 0ac28397be4ad0a8da0e31254e4f2f1e12c766ab560559da5bc9af19a7ade302

See more details on using hashes here.
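
For example, to check a downloaded sdist against the SHA256 digest listed above (assuming GNU coreutils' sha256sum is available):

$ sha256sum multiexec-0.0.4.tar.gz

The printed digest should match the SHA256 value above.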

File details

Details for the file multiexec-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: multiexec-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 4.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.8.6

File hashes

Hashes for multiexec-0.0.4-py3-none-any.whl
  • SHA256: 1f25828eff9d1655ff979c733c8ac5211511d82acafd9a67734b5ae4293a1a78
  • MD5: 3d0eb9ea147a317aa303064be1488f3b
  • BLAKE2b-256: 12ed1027732f2fdbcf6e6cdb3d6084ee46b67e9e4a374b37aad48003f41cbbd6

See more details on using hashes here.
