Crawl through AWS accounts in an organization using an assumed role from the master (management) account.
Project description
Overview
Crawl through AWS accounts in an organization using an assumed role from the master (management) account. You can specify a comma-separated string of account IDs to crawl specific accounts, an Organizational Unit ID to crawl all accounts within that OU, or a comma-separated string of account statuses to crawl all matching accounts in the organization.
Crawling precedence:
1. Specific accounts
2. Organizational Unit
3. All matching accounts in the organization
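A minimal sketch of how this precedence could be applied, assuming the listing helpers shown in the example below (`create_account_list`, `list_ou_accounts`, `list_accounts`); the `select_account_ids` wrapper itself is hypothetical and not part of the package:

```python
# Hypothetical wrapper illustrating the crawling precedence:
# explicit account IDs first, then an OU, then the whole organization.
def select_account_ids(crawler, session, account_csv=None, ou_id=None, statuses='ACTIVE'):
    if account_csv:
        # 1. Specific accounts take precedence.
        return crawler.create_account_list(account_csv)
    if ou_id:
        # 2. Otherwise, all accounts in the given Organizational Unit.
        return crawler.list_ou_accounts(session, ou_id)
    # 3. Otherwise, all matching accounts in the organization.
    return crawler.list_accounts(session, statuses)
```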
Usage
Installation:

pip3 install aws_crawler

or, equivalently:

python3 -m pip install aws_crawler
Example:
Get STS caller identities
- Also featuring multithreader and aws_authenticator (both installed with aws_crawler):
import aws_crawler
import boto3
from multithreader import threads
from aws_authenticator import AWSAuthenticator as awsauth
from pprint import pprint as pp


def get_caller_identity(
    account_id: str,
    items: dict
) -> dict:
    """Get AWS STS caller identities from accounts."""
    print(f'Working on {account_id}...')

    try:
        # Get auth credential for each account.
        credentials = aws_crawler.get_credentials(
            items['session'],
            f'arn:aws:iam::{account_id}:role/{items["assumed_role_name"]}',
            items['external_id']
        )

        # Get STS caller identity.
        client = boto3.client(
            'sts',
            aws_access_key_id=credentials['aws_access_key_id'],
            aws_secret_access_key=credentials['aws_secret_access_key'],
            aws_session_token=credentials['aws_session_token'],
            region_name=items['region']
        )
        response = client.get_caller_identity()['UserId']

    except Exception as e:
        response = str(e)

    # Return result.
    return {
        'account_id': account_id,
        'details': response
    }


if __name__ == '__main__':
    # Login to AWS through SSO.
    auth = awsauth(
        sso_url='https://myorg.awsapps.com/start/#',
        sso_role_name='AWSViewOnlyAccess',
        sso_account_id='123456789012'
    )
    session = auth.sso()

    # # Create account list from comma-separated string of IDs.
    # account_ids = aws_crawler.create_account_list(
    #     '123456789012, 234567890123, 345678901234'
    # )

    # Get account list for an Organizational Unit.
    account_ids = aws_crawler.list_ou_accounts(
        session,
        'ou-abc123-asgh39'
    )

    # # Get matching account list for the entire organization.
    # account_ids = aws_crawler.list_accounts(
    #     session,
    #     'ACTIVE,SUSPENDED'
    # )

    # Execute task with multithreading.
    items = {
        'session': session,
        'assumed_role_name': 'MyOrgCrossAccountAccess',
        'external_id': 'lkasf987923ljkf2;lkjf298fj2',
        'region': 'us-east-1'
    }
    results = threads(
        get_caller_identity,
        account_ids,
        items,
        thread_num=5
    )

    # Print results.
    pp(results)
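Each worker returns a dict with `account_id` and `details` (the STS `UserId` on success, or the exception text on failure), so the combined results are easy to post-process. A small, package-independent sketch that indexes results by account; the `index_by_account` helper is illustrative, not part of aws_crawler:

```python
# Turn the list of per-account result dicts into a single lookup table.
# Each item matches get_caller_identity()'s return shape:
# {'account_id': ..., 'details': ...}
def index_by_account(results):
    return {r['account_id']: r['details'] for r in results}
```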
File details
Details for the file aws_crawler-1.2.6.tar.gz.
File metadata
- Download URL: aws_crawler-1.2.6.tar.gz
- Upload date:
- Size: 3.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/5.15.154+
File hashes
Algorithm | Hash digest
---|---
SHA256 | f32a1499b31cd0e2ff9ad316f173f8fac2e00aa0bf86a20af194447401076df5
MD5 | 6d70c9ebe04f37bacc94fcf15b9eb5fd
BLAKE2b-256 | 9c9d260d112671309970b7bfb03d2eb6bc3669dedcf94cbbdabe2f2a643c2a1a
File details
Details for the file aws_crawler-1.2.6-py3-none-any.whl.
File metadata
- Download URL: aws_crawler-1.2.6-py3-none-any.whl
- Upload date:
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/5.15.154+
File hashes
Algorithm | Hash digest
---|---
SHA256 | 772a577ef569d424c4b06f0695fb18300077b850da0173ff0944e0054755725d
MD5 | f83916b61651a502dc1b5e70a7e98dc7
BLAKE2b-256 | 69a3685d1678d7eb4d04ebd6a83370000ee988003b72c1431a7d47b12ca23a70