Project description

Bedrock Bot

This project is a basic CLI-based chat bot that uses Amazon Bedrock to answer questions. It can take input from stdin, from CLI arguments, or interactively when no parameters are passed.

Installation

  1. pip install bedrock-bot
  2. You will also need AWS credentials available in your shell; any of the usual mechanisms works (an IAM user access key/secret key configured with the AWS CLI, environment variables, etc. - see the example after this list)
  3. Bedrock also requires you to opt in to (request access to) models before you can use them, via the Amazon Bedrock console
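
As a minimal sketch (the key values below are placeholders, assuming static IAM user credentials), you can export the credentials and a default region as environment variables before running the tool:

$ export AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
$ export AWS_SECRET_ACCESS_KEY=your-secret-access-key
$ export AWS_DEFAULT_REGION=us-east-1
$ bedrock "Hello"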

Usage

Usage: bedrock [OPTIONS] [ARGS]...

Options:
  -r, --region TEXT               The AWS region to use for requests. If no
                                  default region is specified, defaults to us-
                                  east-1
  --raw-output TEXT               Don't interpret markdown in the AI response
  -m, --model [Claude-3-Haiku|Claude-3-Sonnet|Mistral-Large]
                                  The model to use for requests
  -v, --verbose                   Enable verbose logging messages
  -i, --input-file FILENAME       Read in file(s) to be used in your queries
  --help                          Show this message and exit.
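
For example, the options above can be combined to ask a one-shot question against a specific model and region (assuming Claude-3-Sonnet has been enabled for your account in that region):

$ bedrock -m Claude-3-Sonnet -r eu-west-1 "Explain what Amazon Bedrock is in one sentence."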

Directly as a chat bot:

$ bedrock

Hello! I am an AI assistant powered by Amazon Bedrock and using the model Claude-3-Haiku. Enter 'quit' or 'exit' at any time to exit. How may I help you today?
(You can clear existing context by starting a query with 'new>' or 'reset>')

> Hi, what is your name?
My name is Claude.

Using CLI arguments:

$ bedrock "Hi, what is your name?"

Hello! I am an AI assistant powered by Amazon Bedrock and using the model Claude-3-Haiku. Enter 'quit' or 'exit' at any time to exit. How may I help you today?
(You can clear existing context by starting a query with 'new>' or 'reset>')

> Hi, what is your name?
My name is Claude. It's nice to meet you!

Using stdin (note that this only supports one-shot questions, since the pipe consumes stdin and it is no longer an interactive TTY):

$ echo "Hi, what is your name?" > input-file

$ cat input-file | bedrock
Hello! I am an AI assistant powered by Amazon Bedrock and using the model Claude-3-Haiku. Enter 'quit' or 'exit' at any time to exit. How may I help you today?
(You can clear existing context by starting a query with 'new>' or 'reset>')

> Hi, what is your name?

My name is Claude. I'm an AI created by Anthropic. It's nice to meet you!                                                         


Asking about a file:

$ bedrock --input-file bedrock_bot/models/base_model.py write unit tests using pytest for this file
Hello! I am an AI assistant powered by Amazon Bedrock and using the model Claude-3-Haiku. Enter 'quit' or 'exit' at any time to exit. How may I help you today?
(You can clear existing context by starting a query with 'new>' or 'reset>')

> write unit tests using pytest for this file
To write unit tests for the bedrock_bot/models/base_model.py file using pytest, you can create a test_base_model.py file in the tests directory. Here's an example of how you can structure the tests:


 import json
 from unittest.mock import patch, MagicMock
 import pytest
 from bedrock_bot.models.base_model import _BedrockModel, ConversationRole

 class TestBedrockModel:
     def setup_method(self):
         self.model = _BedrockModel("test-model-id")

     def test_reset(self):
         self.model.append_message(ConversationRole.USER, "Hello")
         assert len(self.model.messages) == 1
         self.model.reset()
         assert len(self.model.messages) == 0
...

Shell auto-complete

Shell auto-complete is also supported.

ZSH

  1. _BEDROCK_COMPLETE=zsh_source bedrock > ~/.bedrock-completion.zsh
  2. Add the following to your ~/.zshrc: source ~/.bedrock-completion.zsh

Bash

  1. _BEDROCK_COMPLETE=bash_source bedrock > ~/.bedrock-completion.bash
  2. Add the following to your ~/.bashrc: source ~/.bedrock-completion.bash

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
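
If you prefer the command line to the browser links below, pip itself can fetch the published files (a sketch using standard pip; adjust the version as needed):

$ pip download bedrock-bot==1.2.13 --no-deps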

Source Distribution

bedrock_bot-1.2.13.tar.gz (7.8 kB), uploaded as Source

Built Distribution

bedrock_bot-1.2.13-py3-none-any.whl (9.7 kB), uploaded for Python 3

File details

Details for the file bedrock_bot-1.2.13.tar.gz.

File metadata

  • Download URL: bedrock_bot-1.2.13.tar.gz
  • Upload date:
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.10.14 Linux/6.5.0-1023-azure

File hashes

Hashes for bedrock_bot-1.2.13.tar.gz
  • SHA256: 78f9368d3393f063e0e158e581425d430bf0bbe930d50206318c0d373e2e7588
  • MD5: b254ee4567016d36a30f837f01a2d2f2
  • BLAKE2b-256: fa7429a9caabd53c942d86a74cf4adf4a9809a4c0004eb9c0f0be29dcf13f061

See the PyPI documentation for more details on using hashes.
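
For example, a downloaded source distribution can be checked against the SHA256 digest above with standard tooling (sha256sum on most Linux systems; shasum -a 256 on macOS):

$ sha256sum bedrock_bot-1.2.13.tar.gz

The printed digest should match the SHA256 value listed above.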

File details

Details for the file bedrock_bot-1.2.13-py3-none-any.whl.

File metadata

  • Download URL: bedrock_bot-1.2.13-py3-none-any.whl
  • Upload date:
  • Size: 9.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.10.14 Linux/6.5.0-1023-azure

File hashes

Hashes for bedrock_bot-1.2.13-py3-none-any.whl
  • SHA256: fb66ae59a9aa761cb49c21bdcfa126b851c760af78fe50eb5c5273e0d7134014
  • MD5: dd57707c7180a15f5fb889cabfc5edce
  • BLAKE2b-256: 241b1ac2adfe4ef01d76872d6f7363e0948de7ef0094660abbd1712e4ff85cd3

See the PyPI documentation for more details on using hashes.
