
High-Performance Asynchronous HTTP Client setting Requests on Fire 🔥

Project description

FireRequests 🔥


FireRequests is a high-performance, asynchronous HTTP client library for Python, engineered to accelerate your file transfers. By combining asynchronous concurrency, semaphores, exponential backoff with jitter, and fault-tolerant retries, FireRequests can achieve up to a 10x real-world speedup in file downloads and uploads compared to traditional synchronous methods. It also enables scalable, parallelized LLM requests to providers like OpenAI and Google.

Features 🚀

  • Asynchronous I/O: Non-blocking network and file operations using asyncio, aiohttp, and aiofiles, boosting throughput for I/O-bound tasks.
  • Concurrent Transfers: Uses asyncio.Semaphore to cap the number of in-flight chunk transfers, keeping concurrency high without exhausting system resources.
  • Fault Tolerance: Retries failed chunks with exponentially increasing wait times plus random jitter, so retries back off instead of all firing at once.
  • Chunked Processing: Files are split into configurable chunks that are transferred in parallel, significantly accelerating uploads and downloads (see the sketch after this list).
  • Parallelized LLM Generation: Handles large batches of language model requests to providers like OpenAI and Google with configurable parallelism.
  • Compatibility: Supports interactive environments such as Jupyter via nest_asyncio, allowing the running asyncio event loop to be reused in scripts and notebooks alike.
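
Taken together, these features describe a familiar pattern: split a file into byte ranges, download the ranges concurrently under a semaphore, and retry failed ranges with exponential backoff plus jitter. The sketch below illustrates that pattern with plain asyncio, aiohttp, and aiofiles; it is a simplified illustration of the technique, not FireRequests' internal implementation, and the function names in it are hypothetical.

import asyncio
import random

import aiofiles
import aiohttp


async def fetch_chunk(session, sem, url, start, end, path, max_retries=5):
    # Download one byte range and write it at its offset, retrying with
    # exponential backoff plus random jitter on network errors.
    headers = {"Range": f"bytes={start}-{end}"}
    async with sem:  # caps the number of chunks in flight at once
        for attempt in range(max_retries):
            try:
                async with session.get(url, headers=headers) as resp:
                    resp.raise_for_status()
                    data = await resp.read()
                async with aiofiles.open(path, "r+b") as f:
                    await f.seek(start)
                    await f.write(data)
                return
            except aiohttp.ClientError:
                # back off exponentially, with jitter so retries do not synchronize
                await asyncio.sleep(2 ** attempt + random.random())
        raise RuntimeError(f"chunk {start}-{end} failed after {max_retries} retries")


async def chunked_download(url, path, chunk_size=2 * 1024 * 1024, max_files=10):
    sem = asyncio.Semaphore(max_files)
    async with aiohttp.ClientSession() as session:
        async with session.head(url) as resp:
            size = int(resp.headers["Content-Length"])
        # pre-allocate the output file so each chunk can be written at its offset
        async with aiofiles.open(path, "wb") as f:
            await f.truncate(size)
        await asyncio.gather(*(
            fetch_chunk(session, sem, url, start, min(start + chunk_size, size) - 1, path)
            for start in range(0, size, chunk_size)
        ))

# asyncio.run(chunked_download("https://example.com/largefile.iso", "largefile.iso"))

FireRequests wraps this kind of logic behind the single download() call shown in Quick Start below.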

Installation 📦

Install FireRequests using pip:

pip install firerequests

Quick Start 🏁

Accelerate your downloads with just a few lines of code:

Python Usage

from firerequests import FireRequests

url = "https://mirror.clarkson.edu/zorinos/isos/17/Zorin-OS-17.2-Core-64-bit.iso"

fr = FireRequests()
fr.download(url)

Command Line Interface

fr download https://mirror.clarkson.edu/zorinos/isos/17/Zorin-OS-17.2-Core-64-bit.iso

Parameters:

  • urls (required): One or more URLs to download.
  • --filenames (optional): The filename(s) to save the download(s) under. Defaults to the filename(s) taken from the URL(s).
  • --max_files (optional): The maximum number of concurrent chunk downloads. Defaults to 10.
  • --chunk_size (optional): The size of each chunk in bytes. Defaults to 2 * 1024 * 1024 (2 MB).
  • --headers (optional): Headers to include in the download request.
  • --show_progress (optional): Whether to show a progress bar. Defaults to True for single-file downloads and False for multiple files.
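
These flags map directly onto the Python download() call documented under Advanced Usage; for illustration, the call below mirrors a run with --max_files and --chunk_size at their defaults, using the Quick Start URL.

from firerequests import FireRequests

fr = FireRequests()
# Roughly equivalent to: fr download <url> --max_files 10 --chunk_size 2097152
fr.download(
    "https://mirror.clarkson.edu/zorinos/isos/17/Zorin-OS-17.2-Core-64-bit.iso",
    max_files=10,
    chunk_size=2 * 1024 * 1024,
)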

Real-World Speed Test 🏎️

FireRequests delivers significant performance improvements over traditional download methods. Below is the result of a real-world speed test:

Normal Download 🐌: 100%|██████████| 3.42G/3.42G [18:24<00:00, 3.10MB/s]
Downloading on 🔥: 100%|██████████| 3.42G/3.42G [02:38<00:00, 21.6MB/s]

🐌 Download Time: 1104.84 seconds
🔥 Download Time: 158.22 seconds
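
The timings above can be reproduced with the compare helper documented under Advanced Usage, for example against the Quick Start ISO:

from firerequests import FireRequests

# compare() benchmarks a normal download against the FireRequests path
# (see "Comparing Download Speed" under Advanced Usage)
fr = FireRequests()
fr.compare("https://mirror.clarkson.edu/zorinos/isos/17/Zorin-OS-17.2-Core-64-bit.iso")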

[!TIP] For Hugging Face Hub downloads, it is recommended to use hf_transfer for maximum speed gains. For more details, please take a look at this section.

Advanced Usage ⚙️

Downloading Files

from firerequests import FireRequests

urls = ["https://example.com/file1.iso", "https://example.com/file2.iso"]
filenames = ["file1.iso", "file2.iso"]

fr = FireRequests()
fr.download(
    urls,
    filenames,
    max_files=10,
    chunk_size=2 * 1024 * 1024,
    headers={"Authorization": "Bearer token"},
    show_progress=True,
)

  • urls: The URL or list of URLs of the file(s) to download.
  • filenames: The filename(s) to save the downloaded file(s). If not provided, filenames are extracted from the URLs.
  • max_files: The maximum number of concurrent chunk downloads. Defaults to 10.
  • chunk_size: The size of each chunk in bytes. Defaults to 2 * 1024 * 1024 (2 MB).
  • headers: A dictionary of headers to include in the download request (optional).
  • show_progress: Whether to show a progress bar during download. Defaults to True for a single file, and False for multiple files (optional).

Uploading Files

from firerequests import FireRequests

file_path = "largefile.iso"
parts_urls = ["https://example.com/upload_part1", "https://example.com/upload_part2", ...]

fr = FireRequests()
fr.upload(
    file_path,
    parts_urls,
    chunk_size=2 * 1024 * 1024,
    max_files=10,
    show_progress=True,
)

  • file_path: The local path to the file to upload.
  • parts_urls: A list of URLs where each part of the file will be uploaded.
  • chunk_size: The size of each chunk in bytes. Defaults to 2 * 1024 * 1024 (2 MB).
  • max_files: The maximum number of concurrent chunk uploads. Defaults to 10.
  • show_progress: Whether to show a progress bar during upload. Defaults to True.
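
Under the hood, this kind of multipart upload is commonly implemented by reading the file in chunk_size slices and sending each slice to its corresponding part URL, bounded by a semaphore. A minimal, generic sketch of that pattern follows; it assumes the part URLs accept HTTP PUT and is not FireRequests' actual implementation.

import asyncio

import aiofiles
import aiohttp


async def upload_parts(file_path, parts_urls, chunk_size=2 * 1024 * 1024, max_files=10):
    # Generic sketch: send one slice of the file to each pre-signed part URL.
    sem = asyncio.Semaphore(max_files)

    async def put_part(session, index, part_url):
        async with sem:  # cap concurrent part uploads
            # read this part's slice at its byte offset
            async with aiofiles.open(file_path, "rb") as f:
                await f.seek(index * chunk_size)
                data = await f.read(chunk_size)
            async with session.put(part_url, data=data) as resp:
                resp.raise_for_status()

    async with aiohttp.ClientSession() as session:
        await asyncio.gather(
            *(put_part(session, i, url) for i, url in enumerate(parts_urls))
        )

# asyncio.run(upload_parts("largefile.iso", parts_urls))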

Comparing Download Speed

from firerequests import FireRequests

url = "https://example.com/largefile.iso"

fr = FireRequests()
fr.compare(url)

Generating Text with LLMs

FireRequests supports generating responses from LLMs like OpenAI’s and Google’s generative models in parallel batches.

from firerequests import FireRequests

# Initialize FireRequests
fr = FireRequests()

# Set parameters
provider = "openai"
model = "gpt-4o-mini"
system_prompt = "Provide concise answers."
user_prompts = ["What is AI?", "Explain quantum computing.", "What is Bitcoin?", "Explain neural networks."]
parallel_requests = 2

# Generate responses
responses = fr.generate(
    provider=provider,
    model=model,
    system_prompt=system_prompt,
    user_prompts=user_prompts,
    parallel_requests=parallel_requests
)

print(responses)

License 📄

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Sponsors ❤️

Become a sponsor and get a logo here. The funds are used to defray the cost of development.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

firerequests-0.1.1.tar.gz (15.4 kB)

Uploaded Source

Built Distribution

firerequests-0.1.1-py3-none-any.whl (13.6 kB)

Uploaded Python 3

File details

Details for the file firerequests-0.1.1.tar.gz.

File metadata

  • Download URL: firerequests-0.1.1.tar.gz
  • Upload date:
  • Size: 15.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for firerequests-0.1.1.tar.gz:

  • SHA256: b0a7da072fbc323c94193322824e0a6da755e8ff1065148b5137dd671ad46395
  • MD5: b618508dced9168d801965a46c8f134b
  • BLAKE2b-256: b26bbdd0c9a43ff5711ead319d518fc6fb1b4488c2ceffcfc587b40ab240c8a6

See more details on using hashes here.

File details

Details for the file firerequests-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for firerequests-0.1.1-py3-none-any.whl:

  • SHA256: f4bb29ad76eaf5692fd806969ee1d9c6a90a3e98806199c1687f23720a09fda6
  • MD5: c26f8cb6383ddbdab70294aac49acfbe
  • BLAKE2b-256: b3152dd00c67cbaf134d9164dfa453ba151562d1ae28651d7c218bb4bc43602c

See more details on using hashes here.
