# llm-perplexity

LLM access to pplx-api by Perplexity Labs
## Installation

Install this plugin in the same environment as LLM:

```bash
llm install llm-perplexity
```
## Usage

First, set an API key for Perplexity AI:

```bash
llm keys set perplexity
# Paste key here
```

Run `llm models` to list the models, and `llm models --options` to include a list of their options.
Run prompts like this:

```bash
llm -m sonar-small-chat 'Fun facts about pelicans'
llm -m sonar-small-online 'Fun facts about walruses'
llm -m sonar-medium-chat 'Fun facts about wolves'
llm -m sonar-medium-online 'Fun facts about foxes'
llm -m codellama-70b-instruct 'Fun facts about lemurs'
llm -m mistral-7b-instruct 'Fun facts about coyotes'
llm -m mixtral-8x7b-instruct 'Fun facts about tigers'
```
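Under the hood, these prompts are sent to Perplexity's OpenAI-compatible chat completions API. As an illustrative sketch of the request body such a call carries (the `build_request` helper below is hypothetical, not the plugin's actual code; field names follow the OpenAI chat format):

```python
import json

# Illustrative only: the shape of an OpenAI-style chat completions
# request body, as accepted by pplx-api. `build_request` is a
# hypothetical helper, not part of llm-perplexity.
def build_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("sonar-small-chat", "Fun facts about pelicans")
print(json.dumps(payload, indent=2))
```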
## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

```bash
cd llm-perplexity
python3 -m venv venv
source venv/bin/activate
```

Now install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```
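With the test dependencies installed, the suite can be run with `pytest`. As a purely illustrative example of the kind of unit test such a plugin might carry (this is not the plugin's actual test code, and `build_messages` is a hypothetical helper):

```python
# Hypothetical test sketch; the plugin's real tests live in its repository.
def build_messages(prompt, system=None):
    # Assumed helper: assemble an OpenAI-style message list,
    # with an optional system prompt first.
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return messages

def test_build_messages():
    msgs = build_messages("hi", system="be brief")
    assert msgs[0] == {"role": "system", "content": "be brief"}
    assert msgs[-1] == {"role": "user", "content": "hi"}

test_build_messages()
```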
This plugin was modeled on the llm-claude-3 plugin by Simon Willison.
## Project details

### Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

**Source Distribution:** llm-perplexity-0.4.tar.gz (7.8 kB)

**Built Distribution:** llm_perplexity-0.4-py3-none-any.whl

Hashes for llm_perplexity-0.4-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 070a5cdd2e7f73f44c813db27dff805888c9a1ada290aebfad44f9928088107d |
| MD5 | d092c1fcb5ea650d6f0ecdd803986b42 |
| BLAKE2b-256 | cf22bd0cc1087e6551304a7d0a2bb8e5ce20d7623924042037894e738702bee0 |