LLM plugin for Meta Llama 2 on AWS Bedrock
Project description
llm-bedrock-meta
Plugin for LLM adding support for Meta's Llama 2 models in Amazon Bedrock.
Installation
Install this plugin in the same environment as LLM:

```
llm install llm-bedrock-meta
```
Configuration
You will need to configure AWS credentials the same way you would for boto3, using configuration files or environment variables.

For example, to use the region us-west-2 and the AWS credentials under the personal profile, set these environment variables:

```
export AWS_DEFAULT_REGION=us-west-2
export AWS_PROFILE=personal
```
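As a rough illustration of how those environment variables are picked up, here is a simplified sketch of boto3-style resolution (`resolve_aws_settings` is a hypothetical helper, not part of the plugin; real boto3 resolution also consults `AWS_REGION`, `~/.aws/config`, and `~/.aws/credentials`):

```python
import os

def resolve_aws_settings():
    # Simplified sketch: boto3 reads these variables when building a session.
    return {
        "region": os.environ.get("AWS_DEFAULT_REGION"),
        "profile": os.environ.get("AWS_PROFILE"),
    }

os.environ["AWS_DEFAULT_REGION"] = "us-west-2"
os.environ["AWS_PROFILE"] = "personal"
print(resolve_aws_settings())
# → {'region': 'us-west-2', 'profile': 'personal'}
```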
Usage
This plugin adds a model called bedrock-llama2-13b.

You can query it like this:

```
llm -m bedrock-llama2-13b "Ten great names for a new space station"
```
Options
- `-o max_gen_len 1024`, default 2_048: The maximum number of tokens to generate before stopping.
- `-o verbose 1`, default 0: Output more verbose logging.
- `-o temperature 0.8`, default 0.6: Use a lower value to decrease randomness in the response.
- `top_p`, default 0.9: Use a lower value to ignore less probable options. Set to 0 or 1.0 to disable.
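These options correspond to fields in the JSON request body that Bedrock's InvokeModel API expects for Meta Llama 2 models. A minimal sketch of assembling such a body (`build_llama2_body` is a hypothetical helper for illustration; the plugin's actual internals may differ):

```python
import json

def build_llama2_body(prompt, max_gen_len=2048, temperature=0.6, top_p=0.9):
    # Defaults mirror the plugin's documented option defaults.
    return json.dumps({
        "prompt": prompt,
        "max_gen_len": max_gen_len,
        "temperature": temperature,
        "top_p": top_p,
    })

# The resulting JSON string would be sent as the body of a
# bedrock-runtime invoke_model call.
body = build_llama2_body("Sing me the alphabet", max_gen_len=20)
```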
Use like this:

```
llm -m bedrock-llama2-13b -o max_gen_len 20 "Sing me the alphabet"
```
```
Here is the alphabet song:
A B C D E F G
H I J
```
Hashes for llm_bedrock_meta-0.1.1a0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 22b489bba54c39f2928b1594d6dd21cd3694c3708fa064dfa895a4f840274a6d |
| MD5 | 197e875dea3e561364a019608d5e3d71 |
| BLAKE2b-256 | c6d78a271f9857db3b41df4fa850a24a946110e23afb0212e528e8ca06a01b9f |