
llm-perplexity


LLM access to pplx-api 3 by Perplexity Labs

Installation

Install this plugin in the same environment as LLM.

llm install llm-perplexity

Usage

First, set an API key for Perplexity AI:

llm keys set perplexity
# Paste key here

Run llm models to list the models, and llm models --options to include a list of their options.

Available Models

Most Perplexity models have access to real-time web information. Here are the currently available models (as of 2025-06-03) from https://docs.perplexity.ai/models/model-cards:

  • sonar-pro - Flagship model (200k context) - with web search
  • sonar - Base model (128k context) - with web search
  • sonar-deep-research - Deep research model (128k context) - with web search
  • sonar-reasoning-pro - Advanced reasoning model (128k context) - with web search
  • sonar-reasoning - Reasoning model (128k context) - with web search
  • r1-1776 - Specialized model (128k context) - no web search

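For quick programmatic reference, the list above can be captured as a lookup table. This is an illustrative sketch, not part of the plugin's API; the model names, context sizes, and web-search flags come from the list above, but the structure is hypothetical.

```python
# Hypothetical lookup table mirroring the model list above.
PERPLEXITY_MODELS = {
    "sonar-pro": {"context": 200_000, "web_search": True},
    "sonar": {"context": 128_000, "web_search": True},
    "sonar-deep-research": {"context": 128_000, "web_search": True},
    "sonar-reasoning-pro": {"context": 128_000, "web_search": True},
    "sonar-reasoning": {"context": 128_000, "web_search": True},
    "r1-1776": {"context": 128_000, "web_search": False},
}

def supports_web_search(model_id: str) -> bool:
    """Return True if the named model performs live web search."""
    if model_id not in PERPLEXITY_MODELS:
        raise ValueError(f"Unknown Perplexity model: {model_id}")
    return PERPLEXITY_MODELS[model_id]["web_search"]
```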
Run prompts like this:

# Flagship model
llm -m sonar-pro 'Latest AI research in 2025'

# Base model
llm -m sonar 'Fun facts about walruses'

# Research and reasoning models
llm -m sonar-deep-research 'Complex research question'
llm -m sonar-reasoning-pro 'Problem solving task'
llm -m sonar-reasoning 'Logical reasoning'
llm -m r1-1776 'Fun facts about seals'

Advanced Options

The plugin supports various parameters to customize model behavior:

# Control randomness (0.0 to 2.0, higher = more random)
llm -m sonar-pro --option temperature 0.7 'Generate creative ideas'

# Nucleus sampling threshold (alternative to temperature)
llm -m sonar-pro --option top_p 0.9 'Generate varied responses'

# Top-k token filtering (between 0 and 2048)
llm -m sonar-pro --option top_k 40 'Generate focused content'

# Limit response length
llm -m sonar-pro --option max_tokens 500 'Summarize this article'

# Return related questions
llm -m sonar-pro --option return_related_questions true 'How does quantum computing work?'

# Use Pro Search or auto classification (requires streaming)
llm -m sonar-pro --option search_type pro 'Analyze the latest developments in quantum computing'
llm -m sonar-pro --option search_type auto 'Compare the energy efficiency of popular EVs'

# Suppress citations section and discourage inline [n] markers
llm -m sonar-pro --option include_citations false 'Latest AI research in 2025'
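The documented ranges can be checked before issuing a request. The helper below is a hypothetical sketch based only on the ranges stated above (temperature 0.0 to 2.0, top_k 0 to 2048); the plugin performs its own validation, so this is illustrative rather than authoritative.

```python
# Hypothetical validator reflecting the option ranges documented above.
def validate_options(temperature=None, top_p=None, top_k=None, max_tokens=None):
    """Raise ValueError if any option falls outside its documented range."""
    if temperature is not None and not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0.0 and 2.0")
    if top_p is not None and not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
    if top_k is not None and not 0 <= top_k <= 2048:
        raise ValueError("top_k must be between 0 and 2048")
    if max_tokens is not None and max_tokens < 1:
        raise ValueError("max_tokens must be positive")
    return True
```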

Using Images with Perplexity

The plugin supports sending images to Perplexity models for analysis (multi-modal input):

# Analyze an image with Perplexity
llm -m sonar-pro --option image_path /path/to/your/image.jpg 'What can you tell me about this image?'

# Ask specific questions about an image
llm -m sonar-pro --option image_path /path/to/screenshot.png 'What text appears in this screenshot?'

# Multi-modal conversation with an image
llm -m sonar-pro --option image_path /path/to/diagram.png 'Explain the process shown in this diagram'

Note: Only certain Perplexity models support image inputs. Currently the following formats are supported: PNG, JPEG, and GIF.
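A simple pre-flight check against the supported formats listed above can save a failed request. The helper below is a hypothetical convenience, not part of the plugin's API; the suffix set reflects the PNG, JPEG, and GIF formats named in the note.

```python
from pathlib import Path

# Supported input formats per the note above (PNG, JPEG, GIF).
SUPPORTED_IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".gif"}

def is_supported_image(path: str) -> bool:
    """Return True if the file extension matches a supported image format."""
    return Path(path).suffix.lower() in SUPPORTED_IMAGE_SUFFIXES
```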

OpenRouter Access

You can also access these models through OpenRouter. First install the OpenRouter plugin:

llm install llm-openrouter

Then set your OpenRouter API key:

llm keys set openrouter

Use the --option use_openrouter true flag to route requests through OpenRouter:

llm -m sonar-pro --option use_openrouter true 'Fun facts about pelicans'
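Conceptually, the flag selects which API endpoint requests are sent to. The sketch below is hypothetical (the plugin's internals may differ); the two base URLs are the public Perplexity and OpenRouter API endpoints.

```python
# Illustrative sketch of endpoint selection for the use_openrouter option.
PERPLEXITY_BASE_URL = "https://api.perplexity.ai"
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def resolve_base_url(use_openrouter: bool = False) -> str:
    """Pick the API endpoint depending on the use_openrouter option."""
    return OPENROUTER_BASE_URL if use_openrouter else PERPLEXITY_BASE_URL
```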

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-perplexity
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

Running Tests

The test suite is comprehensive and tests all example commands from the documentation with actual API calls.

Before running tests, you need to set up your environment variables:

  1. Copy the .env.example file to .env:

    cp .env.example .env
    
  2. Edit the .env file and add your Perplexity API key:

    LLM_PERPLEXITY_KEY=your_perplexity_api_key_here
    
  3. (Optional) If you want to test OpenRouter integration, also add your OpenRouter API key:

    LLM_OPENROUTER_KEY=your_openrouter_api_key_here
    
  4. Install the package and test dependencies using one of these methods:

    Using the setup script:

    ./setup.sh
    

    Using make:

    make setup
    

    Manually:

    pip install -e .
    pip install pytest python-dotenv pillow
    

Run the tests with pytest:

# Run all tests
pytest test_llm_perplexity.py

# Using make
make test

# Run a specific test
pytest test_llm_perplexity.py::test_standard_models

Note: Running the full test suite will make real API calls to Perplexity, which may incur costs depending on your account plan.

This plugin was modeled on the llm-claude-3 plugin by Simon Willison.
