# perplexity-webui-scraper

Python scraper to extract AI responses from Perplexity's web interface.
## Installation

```shell
# From PyPI (stable)
uv pip install perplexity-webui-scraper

# From GitHub (development)
uv pip install git+https://github.com/henrique-coder/perplexity-webui-scraper.git@dev
```
## Requirements

- Perplexity Pro/Max account
- Session token (the `__Secure-next-auth.session-token` cookie from your browser)
## Getting Your Session Token

You can obtain your session token in two ways:

### Option 1: Automatic (CLI Tool)

The package includes a CLI tool to automatically generate and save your session token:

```shell
get-perplexity-session-token
```
This interactive tool will:

- Ask for your Perplexity email
- Send a verification code to your email
- Accept either a 6-digit code or a magic link
- Extract and display your session token
- Optionally save it to your `.env` file

Features:

- Secure ephemeral session (cleared on exit)
- Automatic `.env` file management
- Support for both OTP codes and magic links
- Clean terminal interface with status updates
### Option 2: Manual (Browser)

If you prefer to extract the token manually:

1. Log in at perplexity.ai
2. Open DevTools (`F12`) → Application/Storage → Cookies
3. Copy the value of `__Secure-next-auth.session-token`
4. Store it in `.env`: `PERPLEXITY_SESSION_TOKEN="your_token"`
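Once the token is in `.env`, you can load it before constructing the client. A minimal standard-library sketch follows; the `load_env` helper is illustrative, not part of this package (in practice, `python-dotenv`'s `load_dotenv()` does the same in one call):

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> None:
    """Read KEY="value" lines from a .env file into os.environ."""
    if not Path(path).exists():
        return
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        # Skip blanks, comments, and lines without an assignment
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Existing environment variables take precedence
        os.environ.setdefault(key.strip(), value.strip().strip('"'))


load_env()
token = os.environ.get("PERPLEXITY_SESSION_TOKEN")
```

The token can then be passed as `Perplexity(session_token=token)`.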
## Quick Start

```python
from perplexity_webui_scraper import Perplexity

client = Perplexity(session_token="YOUR_TOKEN")
conversation = client.create_conversation()

conversation.ask("What is quantum computing?")
print(conversation.answer)

# Follow-up
conversation.ask("Explain it simpler")
print(conversation.answer)
```
## Streaming

```python
for chunk in conversation.ask("Explain AI", stream=True):
    print(chunk.answer)
```
## With Options

```python
from perplexity_webui_scraper import (
    ConversationConfig,
    Coordinates,
    Models,
    SourceFocus,
)

config = ConversationConfig(
    model=Models.RESEARCH,
    source_focus=[SourceFocus.WEB, SourceFocus.ACADEMIC],
    language="en-US",
    coordinates=Coordinates(latitude=40.7128, longitude=-74.0060),
)

conversation = client.create_conversation(config)
conversation.ask("Latest AI research", files=["paper.pdf"])
```
## API

### `Perplexity(session_token, config?)`

| Parameter | Type | Description |
|---|---|---|
| `session_token` | `str` | Browser cookie |
| `config` | `ClientConfig` | Timeout, TLS, etc. |
### `Conversation.ask(query, model?, files?, citation_mode?, stream?)`

| Parameter | Type | Default | Description |
|---|---|---|---|
| `query` | `str` | - | Question (required) |
| `model` | `Model` | `Models.BEST` | AI model |
| `files` | `list[str \| PathLike]` | `None` | File paths |
| `citation_mode` | `CitationMode` | `CLEAN` | Citation format |
| `stream` | `bool` | `False` | Enable streaming |
### Models

| Model | Description |
|---|---|
| `Models.RESEARCH` | Research - Fast and thorough for routine research |
| `Models.LABS` | Labs - Multi-step tasks with advanced troubleshooting |
| `Models.BEST` | Best - Automatically selects the most responsive model based on the query |
| `Models.SONAR` | Sonar - Perplexity's fast model |
| `Models.GPT_52` | GPT-5.2 - OpenAI's latest model |
| `Models.GPT_52_THINKING` | GPT-5.2 Thinking - OpenAI's latest model with thinking |
| `Models.CLAUDE_45_OPUS` | Claude Opus 4.5 - Anthropic's Opus reasoning model |
| `Models.CLAUDE_45_OPUS_THINKING` | Claude Opus 4.5 Thinking - Anthropic's Opus reasoning model with thinking |
| `Models.GEMINI_3_PRO` | Gemini 3 Pro - Google's newest reasoning model |
| `Models.GEMINI_3_FLASH` | Gemini 3 Flash - Google's fast reasoning model |
| `Models.GEMINI_3_FLASH_THINKING` | Gemini 3 Flash Thinking - Google's fast reasoning model with thinking |
| `Models.GROK_41` | Grok 4.1 - xAI's latest advanced model |
| `Models.GROK_41_THINKING` | Grok 4.1 Thinking - xAI's latest reasoning model |
| `Models.KIMI_K2_THINKING` | Kimi K2 Thinking - Moonshot AI's latest reasoning model |
| `Models.CLAUDE_45_SONNET` | Claude Sonnet 4.5 - Anthropic's newest advanced model |
| `Models.CLAUDE_45_SONNET_THINKING` | Claude Sonnet 4.5 Thinking - Anthropic's newest reasoning model |
### CitationMode

| Mode | Output |
|---|---|
| `DEFAULT` | `text[1]` |
| `MARKDOWN` | `text[1](url)` |
| `CLEAN` | `text` (no citations) |
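To illustrate the difference between the modes, here is a small regex sketch (not part of the library) that reduces `DEFAULT`- or `MARKDOWN`-style citations to the `CLEAN` form:

```python
import re

# Matches markdown-style citations like "[1](https://example.com)"
# and bare numeric citations like "[1]".
CITATION = re.compile(r"\[\d+\](?:\([^)]*\))?")


def to_clean(text: str) -> str:
    """Strip citation markers and collapse any doubled spaces left behind."""
    return re.sub(r"  +", " ", CITATION.sub("", text)).strip()


print(to_clean("Quantum computers use qubits[1](https://example.com)."))
```

In practice you would simply pass `citation_mode=CitationMode.CLEAN` instead; this only shows what the three output formats look like.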
### ConversationConfig

| Parameter | Default | Description |
|---|---|---|
| `model` | `Models.BEST` | Default model |
| `citation_mode` | `CLEAN` | Citation format |
| `save_to_library` | `False` | Save to library |
| `search_focus` | `WEB` | Search type |
| `source_focus` | `WEB` | Source types |
| `time_range` | `ALL` | Time filter |
| `language` | `"en-US"` | Response language |
| `timezone` | `None` | Timezone |
| `coordinates` | `None` | Location (lat/lng) |
## Exceptions

The library provides specific exception types for better error handling:

| Exception | Description |
|---|---|
| `PerplexityError` | Base exception for all library errors |
| `AuthenticationError` | Session token is invalid or expired (HTTP 403) |
| `RateLimitError` | Rate limit exceeded (HTTP 429) |
| `FileUploadError` | File upload failed |
| `FileValidationError` | File validation failed (size, type, etc.) |
| `ResearchClarifyingQuestionsError` | Research mode is asking clarifying questions (not supported) |
| `ResponseParsingError` | API response could not be parsed |
| `StreamingError` | Error during streaming response |
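Since `RateLimitError` maps to HTTP 429, a natural pattern is to retry with exponential backoff. The `with_retry` helper below is an illustrative sketch, not a library API; it is written against an arbitrary exception type so you can adapt it:

```python
import time
from typing import Callable, Type, TypeVar

T = TypeVar("T")


def with_retry(
    fn: Callable[[], T],
    retry_on: Type[Exception],
    attempts: int = 3,
    base_delay: float = 1.0,
) -> T:
    """Call fn(), retrying with exponential backoff on the given exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of retries: propagate to the caller
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise AssertionError("unreachable")
```

With this library it would wrap a call such as `with_retry(lambda: conversation.ask("..."), RateLimitError)`.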
### Handling Research Mode Clarifying Questions

When using Research mode (`Models.RESEARCH`), the API may ask clarifying questions before providing an answer. Since programmatic interaction is not supported, the library raises a `ResearchClarifyingQuestionsError` containing the questions:

```python
from perplexity_webui_scraper import (
    Models,
    Perplexity,
    ResearchClarifyingQuestionsError,
)

try:
    conversation.ask("Research this topic", model=Models.RESEARCH)
except ResearchClarifyingQuestionsError as error:
    print("The AI needs clarification:")
    for question in error.questions:
        print(f"  - {question}")
    # Consider rephrasing your query to be more specific
```
## MCP Server (Model Context Protocol)

The library includes an MCP server that allows AI assistants (such as Claude) to search with Perplexity AI directly.

### Installation

```shell
uv pip install "perplexity-webui-scraper[mcp]"
```
### Running the Server

```shell
# Set your session token
export PERPLEXITY_SESSION_TOKEN="your_token_here"  # Linux/macOS
set PERPLEXITY_SESSION_TOKEN=your_token_here       # Windows (cmd)

# Run with FastMCP
uv run fastmcp run src/perplexity_webui_scraper/mcp/server.py

# Or test with the dev inspector
uv run fastmcp dev src/perplexity_webui_scraper/mcp/server.py
```
### Claude Desktop Configuration

Add to `~/.config/claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "perplexity": {
      "command": "uv",
      "args": [
        "run",
        "fastmcp",
        "run",
        "path/to/perplexity_webui_scraper/mcp/server.py"
      ],
      "env": {
        "PERPLEXITY_SESSION_TOKEN": "your_token_here"
      }
    }
  }
}
```
### Available Tool

| Tool | Description |
|---|---|
| `perplexity_ask` | Ask questions and get AI-generated answers with real-time data from the web |

Parameters:

| Parameter | Type | Default | Description |
|---|---|---|---|
| `query` | `str` | - | Question to ask (required) |
| `model` | `str` | `"best"` | AI model (`best`, `research`, `gpt52`, `claude_sonnet`, etc.) |
| `source_focus` | `str` | `"web"` | Source type (`web`, `academic`, `social`, `finance`, `all`) |
## Disclaimer

This is an unofficial library. It uses internal APIs that may change without notice. Use at your own risk.

By using this library, you agree to Perplexity AI's Terms of Service.