llm-perplexity
LLM access to pplx-api 3 by Perplexity Labs
Installation
Install this plugin in the same environment as LLM.
llm install llm-perplexity
Usage
First, set an API key for Perplexity AI:
llm keys set perplexity
# Paste key here
Run llm models to list the models, and llm models --options to include a list of their options.
Run prompts like this:
llm -m llama-3-sonar-small-32k-chat 'Fun facts about pelicans'
llm -m llama-3-sonar-small-32k-online 'Fun facts about walruses'
llm -m llama-3-sonar-large-32k-chat 'Fun facts about wolves'
llm -m llama-3-sonar-large-32k-online 'Fun facts about foxes'
llm -m llama-3-8b-instruct 'Fun facts about lemurs'
llm -m llama-3-70b-instruct 'Fun facts about coyotes'
llm -m mixtral-8x7b-instruct 'Fun facts about tigers'
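Under the hood these models are served by Perplexity's OpenAI-compatible chat completions API. If you want to call a model directly without the plugin, here is a minimal sketch; the endpoint URL and response shape are assumptions based on that OpenAI-compatible interface, and build_payload and ask are hypothetical helpers, not part of this plugin:

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint for pplx-api.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(api_key: str, model: str, prompt: str) -> str:
    """POST the prompt and return the first completion's text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        # Assumes the OpenAI-style response shape.
        return json.load(resp)["choices"][0]["message"]["content"]
```

For everyday use the llm CLI above is simpler; this sketch is only useful if you need the raw HTTP request, for example from a script that cannot depend on llm.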
Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd llm-perplexity
python3 -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
llm install -e '.[test]'
This plugin was modeled on the llm-claude-3 plugin by Simon Willison.