
Wrapper library over paramiko to allow remote execution of tasks, with support for parallel execution on multiple hosts.

Project description

Library for running asynchronous parallel SSH commands over many hosts.

parallel-ssh uses asynchronous network requests - no multi-threading or multi-processing is used.

This is a requirement for running commands on many hosts - hundreds, thousands, or hundreds of thousands - which, if done with multi-threading or multi-processing, would grind a system to a halt simply by having that many processes or threads all wanting to execute.


Installation

$ pip install parallel-ssh

Usage Example

See the documentation on Read the Docs for more complete examples.

Run ls on two remote hosts in parallel.

>>> from pssh import ParallelSSHClient
>>> hosts = ['myhost1', 'myhost2']
>>> client = ParallelSSHClient(hosts)
>>> output = client.run_command('ls -ltrh /tmp/', sudo=True)
>>> print output
{'myhost1': {'exit_code': 0, 'stdout': <generator>, 'stderr': <generator>, 'channel': <channel>, 'cmd' : <greenlet>},
 'myhost2': {'exit_code': 0, 'stdout': <generator>, 'stderr': <generator>, 'channel': <channel>, 'cmd' : <greenlet>}}

Stdout and stderr buffers are available in the output dictionary. Iterating over them yields output as it becomes available.

>>> for host in output:
...     for line in output[host]['stdout']:
...         print "Host %s - output: %s" % (host, line)
Host myhost1 - output: drwxr-xr-x  6 xxx xxx 4.0K Jan  1 00:00 xxx
Host myhost2 - output: drwxr-xr-x  6 xxx xxx 4.0K Jan  1 00:00 xxx

Joining on the connection pool can be used to block and wait for all parallel commands to finish if reading stdout/stderr is not required.

>>> client.pool.join()
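
If reading stdout/stderr is not needed but exit statuses are, the per-host exit_code entries in the output dictionary shown above can be inspected once the pool has joined - a minimal sketch using the placeholder hosts from the earlier example:

>>> client.pool.join()
>>> for host in output:
...     print "Host %s exit code: %s" % (host, output[host]['exit_code'])
Host myhost1 exit code: 0
Host myhost2 exit code: 0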

Frequently asked questions

Q:

Why should I use this module and not, for example, fabric?

A:

Fabric is a port of Capistrano from Ruby to Python. Its design goal is to provide a faithful port of Capistrano, with its tasks and roles, to Python, with interactive command-line use as the intended usage. Its use as a library is non-standard and in many cases just plain broken.

Furthermore, its parallel commands use a combination of threads and processes, with extremely high CPU usage and system load while running. Fabric currently stands at over 6,000 lines of code, the majority of which is untested, particularly when used as a library, compared with fewer than 700 lines of code currently in ParallelSSH with over 80% test coverage.

ParallelSSH’s design goals and motivation are to provide a library for running asynchronous SSH commands in parallel, with no load induced on the system by doing so, and with intended usage that is completely programmatic and non-interactive - Fabric meets none of these goals.

Q:

Are SSH agents used?

A:

All available keys in a running SSH agent, in addition to SSH keys in the user’s home directory (~/.ssh/id_dsa, ~/.ssh/id_rsa et al), are automatically used by ParallelSSH.

Q:

Can ParallelSSH forward my SSH agent?

A:

SSH agent forwarding, what ssh -A does on the command line, is supported and enabled by default. Creating an object as ParallelSSHClient(forward_ssh_agent=False) will disable that behaviour.
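
For example, to create a client with agent forwarding disabled, as described above:

>>> client = ParallelSSHClient(hosts, forward_ssh_agent=False)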

Q:

Is tunneling/proxying supported?

A:

ParallelSSH natively supports tunnelling through an intermediate SSH server. Connecting to a remote host is accomplished via an SSH tunnel using the SSH protocol’s direct TCP tunneling feature, with local port forwarding. This is done natively in Python, and tunnel connections are asynchronous like all other connections in the ParallelSSH library. For example: client -> proxy SSH server -> remote SSH destination.

Use the proxy_host and proxy_port parameters to configure your proxy.

>>> client = ParallelSSHClient(hosts, proxy_host='my_ssh_proxy_host')
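
If the proxy listens on a non-standard port, proxy_port can be given as well - a sketch using a placeholder host name and port:

>>> client = ParallelSSHClient(hosts, proxy_host='my_ssh_proxy_host', proxy_port=2222)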

Note that while connections from the ParallelSSH client to the tunnel host are asynchronous, connections from the tunnel host to the remote destination(s) may not be, depending on the SSH server implementation. If the SSH server uses threading to implement its tunnelling and is used to tunnel to a large number of remote destinations, system load on the tunnel server will increase linearly with the number of remote hosts.

Q:

Is there a way to programmatically provide an SSH key?

A:

Yes, use the pkey parameter of the ParallelSSHClient class. For example:

>>> import paramiko
>>> client_key = paramiko.RSAKey.from_private_key_file('user.key')
>>> client = ParallelSSHClient(['myhost1', 'myhost2'], pkey=client_key)

Q:

Is there a user’s group for feedback and discussion about ParallelSSH?

A:

There is a public ParallelSSH Google group set up for this purpose - both posting and viewing are open to the public.

SFTP/SCP

SFTP is supported natively (SCP version 2); no scp command is required.

For example, to copy a local file to remote hosts in parallel:

>>> from pssh import ParallelSSHClient
>>> hosts = ['myhost1', 'myhost2']
>>> client = ParallelSSHClient(hosts)
>>> client.copy_file('../test', 'test_dir/test')
>>> client.pool.join()
Copied local file ../test to remote destination myhost1:test_dir/test
Copied local file ../test to remote destination myhost2:test_dir/test
