A Python client for the Openperplex API
The Openperplex Python library provides an interface to interact with the Openperplex API, allowing you to perform various search and web-related operations.
Installation
To install the Openperplex library, use pip:

```bash
pip install --upgrade openperplex
```
Initialization
To use the Openperplex library, you need to initialize it with your API key:
```python
from openperplex import OpenperplexSync, OpenperplexAsync

api_key = "your_openperplex_api_key_here"
client_sync = OpenperplexSync(api_key)
client_async = OpenperplexAsync(api_key)
```
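Rather than embedding the key as a literal, you can load it from the environment. A minimal sketch (the `OPENPERPLEX_API_KEY` variable name is our assumption, not mandated by the library):

```python
import os

def load_api_key(var_name: str = "OPENPERPLEX_API_KEY") -> str:
    # Fail fast with a clear message when the key is missing.
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(f"Set the {var_name} environment variable")
    return key
```

The returned key can then be passed to `OpenperplexSync` or `OpenperplexAsync` as above.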
Available Methods
The library provides both synchronous and asynchronous versions of its methods. Here are the available methods:
1. search / search_stream
Perform a search query, either as a single response or as a stream.
Synchronous:
```python
# Non-streaming search
result = client_sync.search(
    query="What are the latest developments in AI?",
    date_context="Today is Tuesday 19 of November 2024 and the time is 9:40 PM",
    location="us",  # can be 'us', 'ca', 'uk', ... (see supported locations below)
    model="o3-mini-medium",  # can be 'o3-mini-medium', 'o3-mini-high', 'gpt-4o', 'gpt-4o-mini'
    response_language="en",  # can be 'auto', 'en', 'fr', 'es', 'de', 'it', 'pt', 'nl', 'ja', 'ko', 'zh', 'ar', 'ru', 'tr', 'hi'
    answer_type="text",  # can be 'text', 'markdown', or 'html'
    search_type="general",  # can be 'news' or 'general'
    return_citations=False,  # set to True to return citations
    return_sources=False,  # set to True to return sources
    return_images=False,  # set to True to return images (some queries may not return any)
    recency_filter="anytime"  # can be 'hour', 'day', 'week', 'month', 'year', 'anytime'
)
print(result)

# Streaming search (same parameter options as search above)
for chunk in client_sync.search_stream(
    query="Explain quantum computing",
    date_context="Today is Tuesday 19 of November 2024 and the time is 9:40 PM",
    location="us",
    response_language="en",
    answer_type="text",
    model="o3-mini-medium",
    search_type="general",
    return_citations=False,
    return_sources=False,
    return_images=False,
    recency_filter="anytime"
):
    print(chunk)
```
Asynchronous:
```python
import asyncio

async def search_async():
    # Non-streaming search (same parameter options as the synchronous example)
    result = await client_async.search(
        query="What are the latest developments in AI?",
        date_context="Today is Tuesday 19 of November 2024 and the time is 9:40 PM",
        location="us",
        response_language="en",
        answer_type="text",
        model="o3-mini-medium",
        search_type="general",
        return_citations=False,
        return_sources=False,
        return_images=False,
        recency_filter="anytime"
    )
    print(result)

    # Streaming search
    async for chunk in client_async.search_stream(
        query="Explain quantum computing",
        date_context="Today is Tuesday 19 of November 2024 and the time is 9:40 PM",
        location="us",
        response_language="en",
        answer_type="text",
        model="o3-mini-medium",
        search_type="general",
        return_citations=False,
        return_sources=False,
        return_images=False,
        recency_filter="anytime"
    ):
        print(chunk)

asyncio.run(search_async())
```
2. get_website_text
Retrieve the text content of a website.
Synchronous:
```python
result = client_sync.get_website_text("https://www.example.com")
print(result)
```
Asynchronous:
```python
result = await client_async.get_website_text("https://www.example.com")
print(result)
```
3. get_website_screenshot
Get a screenshot of a website.
Synchronous:
```python
result = client_sync.get_website_screenshot("https://www.example.com")
print(result)
```
Asynchronous:
```python
result = await client_async.get_website_screenshot("https://www.example.com")
print(result)
```
4. get_website_markdown
Get the Markdown representation of a website.
Synchronous:
```python
result = client_sync.get_website_markdown("https://www.example.com")
print(result)
```
Asynchronous:
```python
result = await client_async.get_website_markdown("https://www.example.com")
print(result)
```
5. query_from_url
Perform a query based on the content of a specific URL.
Synchronous:
```python
response = client_sync.query_from_url(
    url="https://www.example.com/article",
    query="What is the main topic of this article?",
    response_language="en",  # can be 'auto', 'en', 'fr', 'es', 'de', 'it', 'pt', 'nl', 'ja', 'ko', 'zh', 'ar', 'ru', 'tr', 'hi'
    answer_type="text",  # can be 'text', 'markdown', or 'html'
    model="o3-mini-medium"  # can be 'o3-mini-medium', 'o3-mini-high', 'gpt-4o', 'gpt-4o-mini'
)
print(response)
```
Asynchronous:
```python
response = await client_async.query_from_url(
    url="https://www.example.com/article",
    query="What is the main topic of this article?",
    response_language="en",
    answer_type="text",
    model="o3-mini-medium"
)
print(response)
```
6. custom_search / custom_search_stream
Perform a custom search query with a system prompt and user prompt.
Synchronous:
```python
# Non-streaming custom search
result = client_sync.custom_search(
    system_prompt="You are a helpful assistant.",
    user_prompt="Explain the theory of relativity",
    location="us",  # can be 'us', 'ca', 'uk', ... (see supported locations below)
    model="o3-mini-medium",  # can be 'o3-mini-medium', 'o3-mini-high', 'gpt-4o', 'gpt-4o-mini'
    search_type="general",  # can be 'news' or 'general'
    return_images=False,  # set to True to return images
    return_sources=False,  # set to True to return sources
    temperature=0.2,  # float value to control the randomness of the output
    top_p=0.9,  # float value to control the diversity of the output
    recency_filter="anytime"  # can be 'hour', 'day', 'week', 'month', 'year', 'anytime'
)
print(result)

# Streaming custom search (same parameter options as custom_search above)
for chunk in client_sync.custom_search_stream(
    system_prompt="You are a helpful assistant.",
    user_prompt="Explain the theory of relativity",
    location="us",
    model="o3-mini-medium",
    search_type="general",
    return_images=False,
    return_sources=False,
    temperature=0.2,
    top_p=0.9,
    recency_filter="anytime"
):
    print(chunk)
```
Asynchronous:
```python
# Non-streaming custom search (run inside an async function)
result = await client_async.custom_search(
    system_prompt="You are a helpful assistant.",
    user_prompt="Explain the theory of relativity",
    location="us",
    model="o3-mini-medium",
    search_type="general",
    return_images=False,
    return_sources=False,
    temperature=0.2,
    top_p=0.9,
    recency_filter="anytime"
)
print(result)

# Streaming custom search
async for chunk in client_async.custom_search_stream(
    system_prompt="You are a helpful assistant.",
    user_prompt="Explain the theory of relativity",
    location="us",
    model="o3-mini-medium",
    search_type="general",
    return_images=False,
    return_sources=False,
    temperature=0.2,
    top_p=0.9,
    recency_filter="anytime"
):
    print(chunk)
```
Parameters
Common Parameters for Search Methods
- query: The search query or question.
- date_context: Optional string giving the date (and time) to use as context (format: "today is 8 of october and time is 4 PM" or "YYYY-MM-DD HH:MM AM/PM"). If empty, the current date of the API server is used.
- location: Country code for search context. Default is "us".
- model: Model to use for the search. Options are "o3-mini-medium", "o3-mini-high", "gpt-4o", or "gpt-4o-mini" (default).
- response_language: Language code for the response. Default is "auto" (auto-detect).
- answer_type: Answer format. Options are "text" (default), "markdown", or "html".
- search_type: Type of search to perform ("general" or "news"). Default is "general".
- return_citations: Whether to return citations. Default is False.
- return_sources: Whether to return sources. Default is False.
- return_images: Whether to return images. Default is False.
- recency_filter: Filter results by recency. Options are "hour", "day", "week", "month", "year", or "anytime". Default is "anytime".
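If you build `date_context` programmatically, a small helper can produce the documented phrasing from a `datetime`. This is a sketch; the helper name is ours, not part of the library:

```python
from datetime import datetime

def make_date_context(now: datetime) -> str:
    # Produces e.g. "Today is Tuesday 19 of November 2024 and the time is 9:40 PM"
    time_part = now.strftime("%I:%M %p").lstrip("0")  # drop leading zero from the hour
    return (
        f"Today is {now.strftime('%A')} {now.day} of {now.strftime('%B')} "
        f"{now.year} and the time is {time_part}"
    )
```

Pass the result as the `date_context` argument, e.g. `date_context=make_date_context(datetime.now())`.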
Custom Search Parameters
- system_prompt: The system prompt for custom search.
- user_prompt: The user prompt for custom search.
- location: Country code for search context. Default is "us".
- model: Model to use for the search. Options are "o3-mini-medium", "o3-mini-high", "gpt-4o", or "gpt-4o-mini" (default).
- temperature: Float value controlling the randomness of the output. Default is 0.2.
- top_p: Float value controlling the diversity of the output. Default is 0.9.
- search_type: Type of search to perform ("general" or "news"). Default is "general".
- return_images: Whether to return images. Default is False.
- return_sources: Whether to return sources. Default is False.
- recency_filter: Filter results by recency. Options are "hour", "day", "week", "month", "year", or "anytime". Default is "anytime".
Supported Locations
The location parameter accepts the following country codes:
us (United States), ca (Canada), uk (United Kingdom), mx (Mexico), es (Spain), de (Germany), fr (France), pt (Portugal), nl (Netherlands), tr (Turkey), it (Italy), pl (Poland), ru (Russia), za (South Africa), ae (United Arab Emirates), sa (Saudi Arabia), ar (Argentina), br (Brazil), au (Australia), cn (China), kr (Korea), jp (Japan), in (India), ps (Palestine), kw (Kuwait), om (Oman), qa (Qatar), il (Israel), ma (Morocco), eg (Egypt), ir (Iran), ly (Libya), ye (Yemen), id (Indonesia), pk (Pakistan), bd (Bangladesh), my (Malaysia), ph (Philippines), th (Thailand), vn (Vietnam)
Supported Languages
The response_language parameter accepts the following language codes:
- auto: Auto-detect the user question language (default)
- en: English
- fr: French
- es: Spanish
- de: German
- it: Italian
- pt: Portuguese
- nl: Dutch
- ja: Japanese
- ko: Korean
- zh: Chinese
- ar: Arabic
- ru: Russian
- tr: Turkish
- hi: Hindi
Best Practices
- Query Precision: Provide clear and concise queries to get accurate results.
- Custom Search: Use custom_search or custom_search_stream to write your own system and user prompts for more specific queries. Always be specific with the user prompt, since it is used for the web search. Include date context in your system prompt if needed, and if you need citations, add a citation instruction to the system prompt.
- API Key Security: Never hard-code your API key in your source code. Use environment variables or secure configuration management.
- Error Handling: Always implement proper error handling to manage API errors and network issues gracefully.
- Asynchronous Usage: For applications that handle multiple requests concurrently, consider using the asynchronous client.
- Streaming Responses: When using search_stream or custom_search_stream, handle the streaming nature of the response appropriately in your application.
- Model Selection: Use the model parameter to specify the model for the search. The default is "gpt-4o-mini"; other options are "o3-mini-medium", "o3-mini-high", and "gpt-4o".
- Date Context: When historical context matters for your query, specify the date_context parameter, using the format "Today is Tuesday 19 of November 2024 and the time is 9:40 PM".
- Localization: Use the location parameter to get localized results.
- Response Language: Use the response_language parameter to get responses in different languages.
- Recency Filter: Use the recency_filter parameter to filter results by recency.
- Search Type: Use the search_type parameter to specify the type of search ("general" or "news").
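For the streaming best practice above, one simple pattern is to accumulate chunks into the complete answer while (optionally) rendering them as they arrive. A sketch, assuming the stream yields plain text chunks; verify this against the actual objects search_stream yields:

```python
from typing import Iterable

def collect_stream(chunks: Iterable[str]) -> str:
    # Accumulate streamed chunks into the full answer string.
    parts: list[str] = []
    for chunk in chunks:
        parts.append(chunk)  # a real app might also display each chunk here
    return "".join(parts)
```

Usage: `answer = collect_stream(client_sync.search_stream(query="Explain quantum computing"))`.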
Error Handling
The library raises OpenperplexError exceptions for API errors. Always wrap your API calls in try-except blocks:
```python
from openperplex import OpenperplexSync, OpenperplexError

try:
    result = client_sync.search("AI advancements")
    print(result)
except OpenperplexError as e:
    print(f"An error occurred: {e}")
Remember to handle potential network errors and other exceptions as needed in your application.
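Transient network failures can often be handled by retrying with exponential backoff. A generic sketch, not part of the library; decide for your setup which exceptions are safe to retry (a network timeout usually is, a bad request usually is not):

```python
import time

def with_retries(call, max_attempts: int = 3, base_delay: float = 1.0):
    # Retry a zero-argument callable with exponential backoff:
    # waits base_delay, then 2x, 4x, ... between attempts.
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Usage: `result = with_retries(lambda: client_sync.search("AI advancements"))`.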
Discord Community
- Join our Discord community to get help, share your projects, and discuss the latest updates: Openperplex Discord
Conclusion
The Openperplex Python library provides a powerful interface to access advanced search and web analysis capabilities. By leveraging its various methods and parameters, you can create sophisticated applications that can understand and process web content in multiple languages and contexts.
For any issues, feature requests, or further questions, please open an issue.