
LLM plugin for Meta Llama 2 on AWS Bedrock


llm-bedrock-meta


Plugin for LLM adding support for Meta Llama 2 models on Amazon Bedrock.

Installation

Install this plugin in the same environment as LLM:

llm install llm-bedrock-meta

Configuration

You will need to configure AWS credentials and a region through the standard boto3 mechanisms, such as environment variables or a shared credentials file.

For example, to use the region us-west-2 and AWS credentials under the personal profile, set the environment variables

export AWS_DEFAULT_REGION=us-west-2
export AWS_PROFILE=personal
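
If you prefer to configure this from Python (for example in a notebook) rather than your shell, the same settings can be applied through `os.environ` before LLM runs, since boto3 reads them at client-creation time. A minimal sketch; the region and profile names are just examples:

```python
import os

# Equivalent of the `export` lines above. `setdefault` leaves any
# values already set in the shell untouched.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-west-2")
os.environ.setdefault("AWS_PROFILE", "personal")
```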

Usage

This plugin adds a model called bedrock-llama2-13b. You can also use it with the alias bl2.

You can query it like this:

llm -m bedrock-llama2-13b "Ten great names for a new space station"
llm -m bl2 "Ten great names for a new space station"

You can also chat with the model:

llm chat -m bl2
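
LLM also exposes a Python API, so the same model can be called programmatically once the plugin is installed. A hedged sketch (the `ask_llama2` helper is hypothetical; `llm.get_model` and `Response.text()` are part of LLM's documented Python API):

```python
def ask_llama2(prompt, model_id="bl2"):
    # Lazy import: requires the `llm` package plus llm-bedrock-meta
    # (and AWS credentials) to actually run.
    import llm

    model = llm.get_model(model_id)  # resolves the bl2 alias
    response = model.prompt(prompt)  # sends the request to Bedrock
    return response.text()


if __name__ == "__main__":
    print(ask_llama2("Ten great names for a new space station"))
```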

Options

  • -o max_gen_len 1024, default 2048: The maximum number of tokens to generate before stopping.
  • -o verbose 1, default 0: Output more verbose logging.
  • -o temperature 0.8, default 0.6: Use a lower value to decrease randomness in the response.
  • -o top_p 0.5, default 0.9: Use a lower value to ignore less probable options. Set to 0 or 1.0 to disable.
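
These options map onto the JSON request body that Bedrock's Llama 2 models accept. As an illustration of that mapping, here is a sketch of the wire format (field names follow Bedrock's documented Llama 2 schema; this is not the plugin's actual code):

```python
import json


def build_llama2_body(prompt, max_gen_len=2048, temperature=0.6, top_p=0.9):
    """Serialize an invoke_model request body for a Bedrock Llama 2 model.

    Defaults mirror the plugin's option defaults listed above.
    """
    return json.dumps(
        {
            "prompt": prompt,
            "max_gen_len": max_gen_len,
            "temperature": temperature,
            "top_p": top_p,
        }
    )
```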

Use like this:

llm -m bedrock-llama2-13b -o max_gen_len 20 "Sing me the alphabet"
 Here is the alphabet song:

A B C D E F G
H I J
