Log into your AWS account and call Bedrock models in Python
Project description
bedrock-inference
Keywords: generative AI, Amazon AWS, Bedrock, API, wrapper, LLM, inference, NLP.
Bedrock Inference is a simple Python package that handles MFA/2FA authentication and on-demand calls to AWS Bedrock foundation models.
Requirements
- An AWS IAM account
- An access key ID (e.g., AKIAIOSFODNN7EXAMPLE) and its corresponding secret access key (e.g., wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY) for API requests
- These credentials can be created through Home --> Security credentials --> Access Keys --> Create access key
- If you have the AWS CLI installed and configured on your machine, chances are your credentials are already stored in ~/.aws/credentials and/or in the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. In this case, the package will retrieve them automatically (a short example of setting these variables follows this list).
- Note: AWS CLI is not required to use bedrock-inference.
- A multi-factor authentication (MFA) device ARN (e.g., arn:aws:iam::012345678910:mfa/example_username)
- The ARN of your MFA device can be located at Home --> Security credentials --> Multi-factor authentication
- The 6-digit temporary token generated by the MFA device (e.g., 366712)
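If you go the environment-variable route, one illustrative option (a sketch, not part of the package, using the placeholder keys above rather than real credentials) is to export the variables from Python before logging in:

import os

# Illustrative only: replace with your own keys and never hard-code real credentials
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAIOSFODNN7EXAMPLE"
os.environ["AWS_SECRET_ACCESS_KEY"] = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

With these variables set, bedrock.aws_login_mfa(arn=..., token=...) can be called without passing the keys explicitly (see Usage below).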
Installation
- Make sure you have the latest version of pip installed
pip install --upgrade pip
- Install bedrock_inference through pip
pip install bedrock_inference
Usage
from bedrock_inference import bedrock
# Authenticate through MFA and establish a session
session = bedrock.aws_login_mfa(arn="MFA_device_arn", token="MFA_token", aws_access_key_id="your_ID", aws_secret_access_key="your_key")
#session = bedrock.aws_login_mfa(arn="MFA_device_arn", token="MFA_token") #use this if you already have AWS credentials set up for API requests on your machine
# Instantiate your Bedrock caller
caller = bedrock.Bedrock(session=session, region="your_aws_region")
#Check available models in your region
print(caller.list_models())
#Invoke the target AWS model and retrieve the generated answer
answer = caller.invoke_model(prompt="What's the conceptual opposite of 'Hello World'?", model_id="target_aws_model_id")
print(answer)
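Since the MFA token is short-lived, you will typically type it in at call time rather than hard-coding it. Below is a minimal end-to-end sketch that reads the token interactively with the standard getpass module; the ARN, region, and model ID are placeholders to replace with your own values:

from getpass import getpass
from bedrock_inference import bedrock

# Read the short-lived 6-digit MFA token at runtime instead of hard-coding it
token = getpass("Enter your 6-digit MFA token: ")

# Credentials are picked up from ~/.aws/credentials or the environment in this variant
session = bedrock.aws_login_mfa(arn="arn:aws:iam::012345678910:mfa/example_username", token=token)
caller = bedrock.Bedrock(session=session, region="us-east-1")

# Pick a model_id from the printed list; the prompt below is just an example
print(caller.list_models())
print(caller.invoke_model(prompt="Summarize AWS Bedrock in one sentence.", model_id="target_aws_model_id"))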
Contacts and Useful Links
- Project Link: https://github.com/detsutut/bedrock-inference
- Package Link: https://pypi.org/project/bedrock-inference/
License
Distributed under the MIT License. See LICENSE for more information.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file bedrock_inference-0.0.13.tar.gz.
File metadata
- Download URL: bedrock_inference-0.0.13.tar.gz
- Upload date:
- Size: 7.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | c85ac16b731d8a1eb723ace3962e6d8587b3bdc5a8aa8d9c35e24b9687eedd99
MD5 | 804a0fccfe42122a583f2264dbc01f73
BLAKE2b-256 | 7169dd62848860b19ce7a5be1dff12c8caad2296319674a0e2d65831fe672fa3
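To double-check a downloaded archive against the digest above, here is a short sketch using Python's standard hashlib module (the local file path is an assumption, adjust it to wherever you saved the archive):

import hashlib

# Published SHA256 digest of bedrock_inference-0.0.13.tar.gz (from the table above)
EXPECTED = "c85ac16b731d8a1eb723ace3962e6d8587b3bdc5a8aa8d9c35e24b9687eedd99"

# Hash the downloaded file in chunks and compare with the published digest
h = hashlib.sha256()
with open("bedrock_inference-0.0.13.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        h.update(chunk)
print("OK" if h.hexdigest() == EXPECTED else "Hash mismatch")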
File details
Details for the file bedrock_inference-0.0.13-py3-none-any.whl.
File metadata
- Download URL: bedrock_inference-0.0.13-py3-none-any.whl
- Upload date:
- Size: 7.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2b4a7dd09174f2e2b733dea799743e68d50523a1cf1b88abe05744c95df96bb2
MD5 | 80c05c7d26779eebb0f4cf4fb5f67b2d
BLAKE2b-256 | 3e11d65b476b3b20eceb0df013d15bb6232e480a4d680e4f9ae1afcb165eec12