Ask Amazon Bedrock
Converse with your favorite Amazon Bedrock large language model from the command line.
This tool is a wrapper around the low-level Amazon Bedrock APIs. Its main added value is that it locally persists AWS account and model configuration to enable quick and easy interaction.
Installation
⚠️ Requires Python >= 3.9
⚠️ Requires a working AWS CLI setup configured with a profile that allows Amazon Bedrock access. See CLI documentation for details.
pip install ask-bedrock
Please refer to the troubleshooting section below if this does not work.
You can also build/run this project locally, see Building and Running Locally.
Usage
Activating models
Before you can use this command line tool, you need to request model access through the AWS Console in a region where Bedrock is available: switch to the region where you want to run Bedrock, go to "Model access", click "Edit", activate the models you wish to use, and then click "Save changes".
Invocation
To start a conversation, simply enter the following command:
ask-bedrock converse
If you don't need a conversation, you can get a simple request-response using:
ask-bedrock prompt "What's up?"
Upon the first run, you will be led through a configuration flow. To learn more about configuration options, see the Configuration section below.
If you’re fully configured, the tool will show you a >>> prompt and you can start interacting with the configured model.
Multi-line prompts can be wrapped into <<< >>> blocks.
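For example, an interaction with a multi-line prompt might look like this (illustrative session; the prompt text is made up):

```
>>> <<<
Summarize this in one sentence:
Amazon Bedrock offers foundation models
from several providers behind one API.
>>>
```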
To end your interaction, hit Ctrl + D. Note that the conversation will be lost.
MCP
Ask Amazon Bedrock supports the Model Context Protocol (MCP). You can register MCP servers through configuration; their resources and tools are auto-discovered and then available during invocation.
Pricing
Note that using Ask Amazon Bedrock incurs AWS fees. For more information, see Amazon Bedrock pricing. Consider using a dedicated AWS account and AWS Budgets to control costs.
Configuration
Ask Amazon Bedrock stores your user configuration in $HOME/.config/ask-bedrock/config.yaml. This file may contain several sets of configuration (contexts). For instance, you can use contexts to switch between different models. Use the --context parameter to select the context you'd like to use. The default context is default.
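For example, a config.yaml with two contexts might look like this (the regions, profiles, and model IDs below are placeholders, not recommendations):

```yaml
contexts:
  default:
    region: "us-east-1"
    aws_profile: "personal"
    model_id: "anthropic.claude-3-haiku-20240307-v1:0"
    inference_config: "{}"
  work:
    region: "eu-central-1"
    aws_profile: "work"
    model_id: "ai21.j2-ultra-v1"
    inference_config: '{"maxTokens": 3000}'
```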
If no configuration is found for a selected context, a new one is created. If you want to change an existing config, use
ask-bedrock configure --context mycontext
You can also create or edit the configuration file yourself in $HOME/.config/ask-bedrock/config.yaml. Note that MCP configuration is verbose, but mostly auto-discovered during ask-bedrock configure:
contexts:
default:
region: "" # an AWS region where you have activated Bedrock
aws_profile: "" # a profile from your ~/.aws/config file
model_id: "" # a Bedrock model, e.g. "ai21.j2-ultra-v1"
inference_config: "{}" # a JSON object with inference configuration
mcp_servers:
- command: npx
args:
- -y
- '@modelcontextprotocol/server-filesystem'
- /Users/uhinze/Downloads
env: {}
name: file
resources: []
tools:
- description: Read the complete contents of a file from the file system. Handles various text encodings and provides detailed error messages if the file cannot be read. Use this tool when you need to examine the contents of a single file. Only works within allowed directories.
inputSchema:
$schema: http://json-schema.org/draft-07/schema#
additionalProperties: false
properties:
path:
type: string
required:
- path
type: object
name: read_file
server_name: file
- <...more tools>
Inference Configuration
The inference_config is passed directly to the Amazon Bedrock Runtime converse_stream API. This configuration controls the behavior of model generation, including parameters like temperature and token limits.
Common parameters include:
- temperature (float): Controls randomness in response generation. Lower values make responses more deterministic.
- topP (float): Controls diversity of responses by considering tokens with top cumulative probability.
- maxTokens (integer): Maximum number of tokens to generate in the response.
- stopSequences (array): Sequences where the model should stop generating.
Example configurations:
{
"temperature": 0.7,
"topP": 0.9,
"maxTokens": 3000
}
{
"temperature": 0.5,
"maxTokens": 500,
"stopSequences": ["\n\n"]
}
For more details, see the Amazon Bedrock Runtime InferenceConfiguration API Reference.
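The shape of the resulting API call can be sketched as follows. Note that `build_converse_kwargs` is a hypothetical helper written for illustration, not part of this tool; it only shows how the JSON string from config.yaml maps onto the `inferenceConfig` parameter of the Converse API:

```python
import json

def build_converse_kwargs(model_id: str, prompt: str, inference_config: str) -> dict:
    """Translate config.yaml fields into keyword arguments for converse_stream.

    inference_config arrives as a JSON string, exactly as stored in config.yaml.
    """
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": json.loads(inference_config),
    }

kwargs = build_converse_kwargs(
    "ai21.j2-ultra-v1",  # placeholder model ID
    "What's up?",
    '{"temperature": 0.7, "topP": 0.9, "maxTokens": 3000}',
)
print(kwargs["inferenceConfig"]["maxTokens"])  # → 3000
# With AWS credentials configured, the dict could then be passed on, e.g.:
# boto3.Session(profile_name="...").client("bedrock-runtime").converse_stream(**kwargs)
```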
Building and Running Locally
pip install build
python -m build
pip install -e .
ask-bedrock converse
Troubleshooting
Q: pip does not install ask-bedrock properly, instead it gives me an error like This environment is externally managed.
A: ask-bedrock needs to be installed globally to be available in your command line. Please follow the guidance in the error message by first installing pipx and then installing ask-bedrock using pipx.
Q: The model responses are cut off mid-sentence.
A: Allow longer responses by increasing the maxTokens value in the inference configuration (see above), for example: {"maxTokens": 3000}
Q: I'm getting an error that is not listed here.
A: Use the --debug option to find out more about the error. If you cannot solve it, create an issue.
Security
See CONTRIBUTING for more information.
License
This project is licensed under the Apache-2.0 License.