
hyperbeam

Hyperbeam is a Python library designed to provide intelligent search tooling.

Features

  • Search via DuckDuckGo for text, news, videos, and images.
  • Optional integration with ScraperAPI for proxied requests.
  • Standardized output schema for search results across different modes.
  • Site-specific search limiting.
  • LLM-powered guided search query generation.
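The standardized schema is easiest to see by example. The exact field names are not documented here, so the shape below is a hypothetical illustration (keys like `title`, `url`, and `snippet` are assumptions, not hyperbeam's confirmed schema):

```python
# Hypothetical result shape -- illustrative only; hyperbeam's actual
# field names may differ.
example_result = {
    "title": "Example page title",
    "url": "https://example.com/article",
    "snippet": "A short excerpt of the matched content.",
    "mode": "text",  # which search mode produced this result
}

# The point of a uniform schema: downstream code can handle any mode the same way.
def summarize(results):
    return [f"{r['title']} ({r['url']})" for r in results]
```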

Installation

You can install hyperbeam directly from PyPI using pip (or any pip-compatible package manager like uv):

pip install hyperbeam

Or using uv:

uv pip install hyperbeam

Prerequisites for Usage:

  • Python 3.10+
  • If you plan to use the ScraperAPI integration: a ScraperAPI API key set as the environment variable SCRAPERAPI_API_KEY.
  • For guided_search_queries: OpenAI API key (OPENAI_API_KEY) and/or Groq API key (GROQ_API_KEY) set as environment variables, depending on the chosen LLM.
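For example, in a POSIX shell you can export the relevant keys before running your script (the values below are placeholders, not real keys):

```shell
# Placeholder values -- substitute your real keys.
export SCRAPERAPI_API_KEY="your_scraperapi_key"
export OPENAI_API_KEY="your_openai_key"
export GROQ_API_KEY="your_groq_key"
```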

Usage

Once installed, you can import and use functions from the hyperbeam package:

Web Search

The web_search function allows you to perform searches using DuckDuckGo:

from hyperbeam import web_search, ddgs_scraperapi_patch

# To use standard DuckDuckGo search (no ScraperAPI):
# Perform a text search
text_results = web_search(keywords="latest advancements in AI", mode="text")
for result in text_results[:2]: # Print first two results
    print(result)

# Perform a news search for the last week
news_results = web_search(keywords="python programming news", mode="news", timeframe="w")
for result in news_results[:2]:
    print(result)

# To use ScraperAPI for proxied requests:
# 1. Ensure the SCRAPERAPI_API_KEY environment variable is set.
# 2. Call the patch function *once* in your application startup.
try:
    ddgs_scraperapi_patch()
    print("ScraperAPI patch applied successfully.")
except ValueError as e:
    print(f"ScraperAPI patch could not be applied: {e}")
    print("Proceeding without ScraperAPI.")

# Example call after attempting to patch (will use ScraperAPI if patch was successful and key was set):
video_results_via_scraper = web_search(keywords="uv python tutorial", mode="video", timeframe="y")
if video_results_via_scraper:
    print("\nVideo results (potentially via ScraperAPI if patched):")
    for result in video_results_via_scraper[:5]:
        print(result)
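Because ddgs_scraperapi_patch is a process-wide monkeypatch, it should run once at startup, before any searches. The snippet below is a generic illustration of the pattern, not hyperbeam's actual implementation: replace a function on a module object so every later caller is routed through a proxy wrapper.

```python
import types

# A stand-in "module" with a request function, to illustrate the pattern.
fake_http = types.SimpleNamespace(get=lambda url: f"direct:{url}")

def apply_proxy_patch(module, api_key):
    """Wrap module.get so all later requests go through a proxy URL.

    Mirrors the patch-once idea: raises ValueError if no key is available,
    which is why the README example wraps the call in try/except.
    """
    if not api_key:
        raise ValueError("API key is not set")
    original_get = module.get
    def proxied_get(url):
        return original_get(f"https://proxy.example/?key={api_key}&url={url}")
    module.get = proxied_get

apply_proxy_patch(fake_http, api_key="k123")
```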

Guided Search Query Generation

The guided_search_queries function uses a Large Language Model (LLM) to generate a list of diverse search queries based on an initial user message. This can help in exploring different facets of a search topic.

Prerequisites:

  • Ensure OPENAI_API_KEY (for GPT models) or GROQ_API_KEY (for Llama models via Groq) is set as an environment variable, depending on the LLM you choose.

from hyperbeam import guided_search_queries
from hyperbeam.typing import Message  # Or define your own Message structure if not importing

# Example messages (replace with your actual message history)
messages: list[Message] = [
    {"role": "user", "content": "What are the best ways to learn a new programming language?"},
    # Add more messages if relevant to your use case,
    # though guided_search_queries currently only uses the last message.
]

try:
    suggested_queries = guided_search_queries(messages=messages)
    print("\nSuggested search queries:")
    for query_info in suggested_queries:
        print(query_info)
except Exception as e:
    print(f"Error generating guided search queries: {e}")

# Example with a Llama model via Groq (ensure GROQ_API_KEY is set)
# from hyperbeam.constants import GUIDED_SEARCH_MODEL # Default is GPT
# messages_for_llama: list[Message] = [
#     {"role": "user", "content": "planning a trip to Kyoto"},
# ]
# try:
#     # You might need to adjust GUIDED_SEARCH_MODEL in constants.py 
#     # or pass the model directly if the function signature allows
#     suggested_queries_llama = guided_search_queries(
#         messages=messages_for_llama, 
#         llm_model="llama3-8b-8192" # Example Llama model available on Groq
#     )
#     print("\nSuggested search queries (Llama):")
#     for query_info in suggested_queries_llama:
#         print(query_info)
# except Exception as e:
#     print(f"Error generating guided search queries with Llama: {e}")

For more detailed examples, including how to set up and use the ScraperAPI patch effectively, refer to the example notebooks or documentation within the repository (once available).
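The two functions compose naturally: generate queries from a conversation, then run each one through a search. The helper below is a hypothetical sketch, not part of hyperbeam; it takes the query and search functions as parameters, so it can be used with hyperbeam's guided_search_queries and web_search or with stubs in tests. It assumes each generated query is a string; if guided_search_queries returns richer objects, extract the query text first.

```python
def guided_web_search(messages, query_fn, search_fn, mode="text", per_query=3):
    """Generate queries from a message history, then search each one.

    query_fn:  callable like hyperbeam.guided_search_queries (assumed to
               yield query strings)
    search_fn: callable like hyperbeam.web_search
    Returns a list of (query, top_results) pairs.
    """
    out = []
    for query in query_fn(messages=messages):
        out.append((query, search_fn(keywords=query, mode=mode)[:per_query]))
    return out
```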

Development Setup

If you want to contribute to hyperbeam or install it for development purposes:

  1. Clone the repository:

    git clone https://github.com/hyprbm/hyperbeam.git
    cd hyperbeam
    
  2. Recommended: Set up a virtual environment using uv (or your preferred tool):

    uv venv # Create a virtual environment (e.g., .venv)
    source .venv/bin/activate # Activate (Linux/macOS)
    # For Windows (PowerShell): .venv\Scripts\Activate.ps1
    # For Windows (CMD): .venv\Scripts\activate.bat
    
  3. Install in editable mode with development dependencies: This project uses pyproject.toml for packaging.

    uv pip install -e ".[development]"  # quotes keep shells like zsh from globbing the brackets
    

    The [development] extra includes tools like linters and formatters (e.g., black, flake8, isort).

  4. Set up Environment Variables for Development (Optional): If you'll be testing the ScraperAPI integration during development, create a .env file in the project root:

    SCRAPERAPI_API_KEY="your_actual_api_key_here"
    OPENAI_API_KEY="your_openai_api_key_here"
    GROQ_API_KEY="your_groq_api_key_here"
    

    The library itself (when used as a package) expects these environment variables to be set directly in the execution environment. For development, a .env file is convenient if you use a tool that loads it, such as python-dotenv (included in the dev dependencies and commonly invoked by test runners or scripts).
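If you'd rather not depend on python-dotenv in a quick script, a minimal stdlib loader is easy to sketch. This is a simplified illustration, not a replacement: it handles plain KEY=VALUE lines and '#' comments but ignores the quoting and interpolation edge cases that python-dotenv covers.

```python
import os

def load_env_file(path=".env"):
    """Minimal .env loader: KEY=VALUE lines, '#' comments, no interpolation.

    Existing environment variables are not overwritten, matching the
    usual dotenv default.
    """
    if not os.path.exists(path):
        return
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"').strip("'"))
```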

Contributing

(Optional: Add details about running tests, linters, or specific contribution guidelines here if applicable.)
