Ask Amazon Bedrock

Converse with your favorite Amazon Bedrock large language model from the command line.

This tool is a wrapper around the low-level Amazon Bedrock APIs and LangChain. Its main added value is that it locally persists AWS account and model configuration, enabling quick and easy interaction.

Installation

⚠️ Requires Python >= 3.9

⚠️ Requires a working AWS CLI setup configured with a profile that allows Amazon Bedrock access. See CLI documentation for details.
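
If you don't have such a profile yet, one way to set one up is with the AWS CLI (the profile name below is just an example):

aws configure --profile bedrock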

pip install ask-bedrock

You can also build/run this project locally, see Building and Running Locally.

Usage

Activating models

Before you can use this command line tool, you need to request model access through the AWS Console in a region where Bedrock is available: switch to the region where you want to run Bedrock, go to “Model access”, click “Edit”, activate the models you wish to use, and then click “Save changes”.
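
If you want to double-check which foundation models are available to you, you can list them with the AWS CLI (the region and profile below are examples):

aws bedrock list-foundation-models --region us-east-1 --profile bedrock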

Invocation

To start a conversation, simply enter the following command:

ask-bedrock converse

If you don't need a conversation, you can get a simple request-response using:

ask-bedrock prompt "What's up?"

Upon the first run, you will be led through a configuration flow. To learn more about configuration options, see the Configuration section below.

If you’re fully configured, the tool will show you a >>> prompt and you can start interacting with the configured model.

Multi-line prompts can be wrapped into <<< >>> blocks.
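
For example, a multi-line prompt entered at the >>> prompt could look like this (the prompt text is illustrative):

<<<
Summarize the following text:
The quick brown fox jumps over the lazy dog.
>>>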

To end your interaction, press Ctrl + D. Note that the conversation history will be lost.

As mentioned above, you can also pass a single prompt for a one-off request-response:

ask-bedrock prompt "complete this sentence: One small step for me"

Pricing

Note that using Ask Amazon Bedrock incurs AWS fees. For more information, see Amazon Bedrock pricing. Consider using a dedicated AWS account and AWS Budgets to control costs.

Configuration

Ask Amazon Bedrock stores your user configuration in $HOME/.config/ask-bedrock/config.yaml. This file may contain several sets of configuration (contexts). For instance, you can use contexts to switch between different models. Use the --context parameter to select the context you'd like to use. The default context is default.
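
For example, to start a conversation using a context named claude-context (a hypothetical name), you could run:

ask-bedrock converse --context claude-context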

If no configuration is found for a selected context, a new one is created. If you want to change an existing config, use

ask-bedrock configure --context mycontext

You can also create or edit the configuration file yourself in $HOME/.config/ask-bedrock/config.yaml:

contexts:
  default:
    region: ""                  # an AWS region where you have activated Bedrock
    aws_profile: ""             # a profile from your ~/.aws/config file
    model_id: ""                # a Bedrock model, e.g. "ai21.j2-ultra-v1"
    model_params: "{}"          # a JSON object with parameters for the selected model
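
A filled-in context could look like the following (all values are illustrative; use your own region, profile, an activated model, and parameters valid for that model):

contexts:
  default:
    region: "us-east-1"
    aws_profile: "bedrock"
    model_id: "ai21.j2-ultra-v1"
    model_params: '{"maxTokens": 500}'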

Model parameters

This JSON is passed to LangChain during client setup (as model_kwargs). The schema depends on the model being used. Have a look at the examples.

If you want to enter model parameters across multiple lines, you can wrap them in <<< >>>, as shown below.
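
For example, during configuration a multi-line parameter value could be entered like this (Claude-style parameters shown; other models use different keys):

<<<
{
  "max_tokens_to_sample": 3000,
  "temperature": 0.7
}
>>>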

Building and Running Locally

pip install build
python -m build
python ask_bedrock/main.py converse
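
If you would rather install the locally built package than run it from source, the wheel produced by the build step can be installed (the exact file name depends on the version; 0.2.0 shown here):

pip install dist/ask_bedrock-0.2.0-py3-none-any.whl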

Feedback

As this tool is still at an early stage, we are very interested in hearing about your experience. Please take a minute to fill out a short survey: https://pulse.aws/survey/GTRWNHT1

Troubleshooting

Q: The model responses are cut off mid-sentence.

A: Configure the model to allow for longer responses. Use model parameters (see above) for this. Claude, for example, accepts the following model parameters: {"max_tokens_to_sample": 3000}


Q: I'm getting an error that is not listed here.

A: Use the --debug option to find out more about the error. If you cannot solve it, create an issue.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
