

Project description

Overview

Crawl through the AWS accounts in an organization by assuming a role in each account from the management (master) account. You can specify a comma-separated string of account IDs to crawl specific accounts, an Organizational Unit ID to crawl all accounts therein, or a comma-separated string of account statuses to crawl all matching accounts in the organization.

Crawling Precedence:

  1. Specific accounts

  2. Organizational Unit

  3. All matching accounts in the organization
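
The precedence above can be sketched as plain selection logic. This is a hypothetical illustration, not the library's internal code: the three lookup functions are injected as callables standing in for `aws_crawler.create_account_list`, `aws_crawler.list_ou_accounts`, and `aws_crawler.list_accounts`.

```python
def resolve_account_ids(
    specific_ids=None,
    ou_id=None,
    statuses=None,
    parse_ids=lambda s: [a.strip() for a in s.split(',')],
    list_ou=lambda ou: [],
    list_org=lambda st: [],
):
    """Pick the account source according to the crawling precedence."""
    if specific_ids:
        # 1. Specific accounts take precedence.
        return parse_ids(specific_ids)
    if ou_id:
        # 2. Then an Organizational Unit.
        return list_ou(ou_id)
    # 3. Otherwise, all matching accounts in the organization.
    return list_org(statuses or 'ACTIVE')
```

For example, `resolve_account_ids(specific_ids='123456789012, 234567890123')` returns the parsed ID list even if an OU ID is also supplied.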

Usage

Installation:

pip3 install aws_crawler
python3 -m pip install aws_crawler

Example:

import aws_crawler
import boto3
from multithreader import threads
from aws_authenticator import AWSAuthenticator as awsauth
from pprint import pprint as pp


def get_caller_identity(
    account_id: str,
    items: dict
) -> dict:
    """Get AWS STS caller identities from accounts."""
    print(f'Working on {account_id}...')

    try:
        # Get auth credentials for each account.
        credentials = aws_crawler.get_credentials(
            items['session'],
            f'arn:aws:iam::{account_id}:role/{items["assumed_role_name"]}',
            items['external_id']
        )

        # Get the STS caller identity.
        client = boto3.client(
            'sts',
            aws_access_key_id=credentials['aws_access_key_id'],
            aws_secret_access_key=credentials['aws_secret_access_key'],
            aws_session_token=credentials['aws_session_token'],
            region_name=items['region']
        )
        response = client.get_caller_identity()['UserId']

    except Exception as e:
        response = str(e)

    # Return the result for this account.
    return {
        'account_id': account_id,
        'details': response
    }


if __name__ == '__main__':
    # Log in to AWS through SSO.
    auth = awsauth(
        sso_url='https://myorg.awsapps.com/start/#',
        sso_role_name='AWSViewOnlyAccess',
        sso_account_id='123456789012'
    )
    session = auth.sso()

    # # Create an account list from a comma-separated string of IDs.
    # account_ids = aws_crawler.create_account_list(
    #     '123456789012, 234567890123, 345678901234'
    # )
    # Get the account list for an Organizational Unit.
    account_ids = aws_crawler.list_ou_accounts(
        session,
        'ou-abc123-asgh39'
    )
    # # Get the matching account list for the entire organization.
    # account_ids = aws_crawler.list_accounts(
    #     session,
    #     'ACTIVE,SUSPENDED'
    # )

    # Execute the task across accounts with multithreading.
    items = {
        'session': session,
        'assumed_role_name': 'MyOrgCrossAccountAccess',
        'external_id': 'lkasf987923ljkf2;lkjf298fj2',
        'region': 'us-east-1'
    }
    results = threads(
        get_caller_identity,
        account_ids,
        items,
        thread_num=5
    )

    # Print results.
    pp(results)
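
The `threads()` call returns one result dict per account, in the shape produced by `get_caller_identity` above. A small post-processing sketch (the sample values below are made up for illustration):

```python
# Hypothetical results, in the shape returned by get_caller_identity.
results = [
    {'account_id': '123456789012', 'details': 'AROAEXAMPLEID:crawler'},
    {'account_id': '234567890123', 'details': 'An error occurred (AccessDenied)'},
]

# Index the results by account ID for quick lookup.
by_account = {r['account_id']: r['details'] for r in results}
```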

Download files

Download the file for your platform.

Source Distribution

aws_crawler-1.2.8.tar.gz (4.0 kB)


Built Distribution


aws_crawler-1.2.8-py3-none-any.whl (5.5 kB)


File details

Details for the file aws_crawler-1.2.8.tar.gz.

File metadata

  • Download URL: aws_crawler-1.2.8.tar.gz
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/5.15.154+

File hashes

Hashes for aws_crawler-1.2.8.tar.gz

  • SHA256: 67e511f41557b4c55406346bf0a19cce080423805cbbd466e6d5d9d059b914da
  • MD5: 4a82aef43b6816d3220996baf337dc86
  • BLAKE2b-256: 32d8fc7177525dd92a07a0bbaac8afbe7a90e54c496bd8ee7a40bd49a5d6a0cc


File details

Details for the file aws_crawler-1.2.8-py3-none-any.whl.

File metadata

  • Download URL: aws_crawler-1.2.8-py3-none-any.whl
  • Size: 5.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/5.15.154+

File hashes

Hashes for aws_crawler-1.2.8-py3-none-any.whl

  • SHA256: 67812a6ffd7672f5807c857587a88edbd0919b92a8f01dd9ce2adbc3d1642a9b
  • MD5: 440c5d4c5f12f8bed2eb4676dbe9d7bf
  • BLAKE2b-256: fcbbaedcf710f7f1534a8929d8b7376f0c21efb7ccec57808000ce830a161cc5

