
README for Polar Llama

Overview

Polar Llama is a Python library designed to make parallel inference calls to the ChatGPT API efficient, building on the Polars dataframe library. It enables users to issue multiple API requests simultaneously, which is significantly faster than handling requests serially.

Key Features

  • Parallel Inference: Send multiple inference requests in parallel to the ChatGPT API without waiting for each individual request to complete.
  • Integration with Polars: Utilizes the Polars dataframe for organizing and handling requests, leveraging its efficient data processing capabilities.
  • Easy to Use: Simplifies the process of sending queries and retrieving responses from the ChatGPT API through a clean and straightforward interface.
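The core idea behind parallel inference can be sketched with a small, stdlib-only example. This is an illustration of the pattern, not Polar Llama's internals; `fake_llm_call` is a hypothetical stand-in for a real API request:

```python
import asyncio

async def fake_llm_call(prompt: str) -> str:
    # Hypothetical stand-in for a network request to an LLM API.
    await asyncio.sleep(0.05)
    return f"answer to: {prompt}"

async def answer_all(prompts: list[str]) -> list[str]:
    # All requests are in flight at once; none waits for the previous
    # one to finish before starting.
    return await asyncio.gather(*(fake_llm_call(p) for p in prompts))

answers = asyncio.run(answer_all(["q1", "q2", "q3"]))
print(answers)  # ['answer to: q1', 'answer to: q2', 'answer to: q3']
```

Polar Llama applies this fan-out pattern to a whole dataframe column of prompts at once.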

Installation

To install Polar Llama from PyPI:

pip install polar_llama

Alternatively, to build the package from source, clone the repository and run:

maturin develop

Example Usage

Here’s how you can use Polar Llama to send multiple inference requests in parallel:

import polars as pl
from polar_llama import string_to_message, inference_async, Provider
import dotenv

dotenv.load_dotenv()

# Example questions
questions = [
    'What is the capital of France?',
    'What is the difference between polars and pandas?'
]

# Creating a dataframe with questions
df = pl.DataFrame({'Questions': questions})

# Adding prompts to the dataframe
df = df.with_columns(
    prompt=string_to_message('Questions', message_type='user')
)

# Sending parallel inference requests
df = df.with_columns(
    answer=inference_async('prompt', provider=Provider.OPENAI, model='gpt-4o-mini')
)

Benefits

  • Speed: Processes multiple queries in parallel, drastically reducing the time required for bulk query handling.
  • Scalability: Scales efficiently as the number of queries grows, making it well suited to high-demand applications.
  • Ease of Integration: Integrates seamlessly into existing Python projects that utilize Polars, making it easy to add parallel processing capabilities.
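The speed claim above can be sanity-checked with a stdlib-only simulation. Here `fake_llm_call` is a hypothetical stand-in whose 0.1 s sleep models per-request network latency, so real numbers will differ:

```python
import asyncio
import time

async def fake_llm_call(prompt: str) -> str:
    await asyncio.sleep(0.1)  # simulated per-request network latency
    return f"answer to: {prompt}"

async def serial(prompts: list[str]) -> list[str]:
    # One request at a time: total time ~= len(prompts) * latency.
    return [await fake_llm_call(p) for p in prompts]

async def parallel(prompts: list[str]) -> list[str]:
    # All requests concurrently: total time ~= one latency.
    return await asyncio.gather(*(fake_llm_call(p) for p in prompts))

prompts = [f"question {i}" for i in range(10)]

t0 = time.perf_counter()
asyncio.run(serial(prompts))
serial_s = time.perf_counter() - t0

t0 = time.perf_counter()
asyncio.run(parallel(prompts))
parallel_s = time.perf_counter() - t0

print(f"serial: {serial_s:.2f}s, parallel: {parallel_s:.2f}s")
```

With the simulated 0.1 s latency, the serial run takes roughly 1 s for 10 prompts while the parallel run takes roughly 0.1 s; against a real API the gap depends on rate limits and server-side latency.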

Contributing

We welcome contributions to Polar Llama! If you're interested in improving the library or adding new features, please feel free to fork the repository and submit a pull request.

License

Polar Llama is released under the MIT license. For more details, see the LICENSE file in the repository.

Roadmap

  • Response Handling: Implement mechanisms to handle and process responses from parallel inference calls.
  • Support for Additional APIs: Extend support to other APIs for parallel inference processing.
  • Function Calling: Add support for function calling and structured outputs in inference requests.

Download files

Download the file for your platform.

Source Distribution

  • polar_llama-0.1.0.tar.gz (167.7 kB): Source

Built Distributions

  • polar_llama-0.1.0-cp38-abi3-win_amd64.whl (6.2 MB): CPython 3.8+, Windows x86-64
  • polar_llama-0.1.0-cp38-abi3-manylinux_2_34_x86_64.whl (7.6 MB): CPython 3.8+, manylinux (glibc 2.34+), x86-64
  • polar_llama-0.1.0-cp38-abi3-macosx_11_0_arm64.whl (6.4 MB): CPython 3.8+, macOS 11.0+, ARM64
  • polar_llama-0.1.0-cp38-abi3-macosx_10_12_x86_64.whl (6.8 MB): CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file polar_llama-0.1.0.tar.gz.

File metadata

  • Download URL: polar_llama-0.1.0.tar.gz
  • Upload date:
  • Size: 167.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: maturin/1.8.2

File hashes

  • SHA256: 0e4f54c1ad5127d4804c1a01acb64f9cf5f1cbb3468300167872c2b683d79156
  • MD5: 882ffbdb9d3fbd9e90565d09f85222cf
  • BLAKE2b-256: 41b8d5065a633704b52e112a006f21bbd465504a11f4c0cfb1ee8d9b3b4746bb

File details

Details for the file polar_llama-0.1.0-cp38-abi3-win_amd64.whl.

File hashes

  • SHA256: 7392f3377d084c02e9b1b78c557a4d19909fb84cfbd72de5fcefb87bd8d5e58d
  • MD5: 2bee0acc5ae5e24c91b596b641e632df
  • BLAKE2b-256: d9a0678c93469af3e96d298f47adf64fca1421891d3b9a967cca1cff743d640b

File details

Details for the file polar_llama-0.1.0-cp38-abi3-manylinux_2_34_x86_64.whl.

File hashes

  • SHA256: 95bf7919e8e6ac142aef30b82d7c7204ebe2b84a0dab706539520cc3e08bdcfd
  • MD5: d2c980cf70d3a5a6f7f14273fba6af4c
  • BLAKE2b-256: a495174a9688d252eb7a45b8f5cc02f4e99182d6787461504366363d9ebeb88f

File details

Details for the file polar_llama-0.1.0-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

  • SHA256: 4981c8183922a2c76e75f258e36205d0335ab07d615cf1f6d0a741905299b8b4
  • MD5: 091a14a0606e94a9c99c6680dbd66481
  • BLAKE2b-256: 6c9d9c61b3b2377cace76e73e3913044f621dfe26a9594a6dc4d51c7f63d3944

File details

Details for the file polar_llama-0.1.0-cp38-abi3-macosx_10_12_x86_64.whl.

File hashes

  • SHA256: b5172ff4118b76b0a26b57c7032f2f29c72125d9bfaa91fe416af31a39d12953
  • MD5: 69797c47004e028cfc491ea8281217f3
  • BLAKE2b-256: 071b214a9916bf0e73b12717829a127c0dc70390c96dd7ec09f1b165fc2f5dc5
