
README for Polar Llama

Overview

Polar Llama is a Python library designed to enhance the efficiency of making parallel inference calls to the ChatGPT API using the Polars dataframe tool. This library enables users to manage multiple API requests simultaneously, significantly speeding up the process compared to serial request handling.

Key Features

  • Parallel Inference: Send multiple inference requests in parallel to the ChatGPT API without waiting for each individual request to complete.
  • Integration with Polars: Utilizes the Polars dataframe for organizing and handling requests, leveraging its efficient data processing capabilities.
  • Easy to Use: Simplifies the process of sending queries and retrieving responses from the ChatGPT API through a clean and straightforward interface.

Installation

To install Polar Llama from source, you will need to execute the following command:

maturin develop

I will be making the package available on PyPI soon.

Example Usage

Here’s how you can use Polar Llama to send multiple inference requests in parallel:

import polars as pl
from polar_llama import string_to_message, inference_async, Provider
import dotenv

dotenv.load_dotenv()

# Example questions
questions = [
    'What is the capital of France?',
    'What is the difference between polars and pandas?'
]

# Creating a dataframe with questions
df = pl.DataFrame({'Questions': questions})

# Adding prompts to the dataframe
df = df.with_columns(
    prompt=string_to_message("Questions", message_type='user')
)

# Sending parallel inference requests
df = df.with_columns(
    answer=inference_async('prompt', provider=Provider.OPENAI, model='gpt-4o-mini')
)

Benefits

  • Speed: Processes multiple queries in parallel, drastically reducing the time required for bulk query handling.
  • Scalability: Scales efficiently as the number of queries grows, making it ideal for high-demand applications.
  • Ease of Integration: Integrates seamlessly into existing Python projects that utilize Polars, making it easy to add parallel processing capabilities.
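The speedup comes from overlapping network latency rather than faster computation. A small self-contained asyncio sketch (independent of Polar Llama; the API call is simulated with a sleep) illustrates why issuing requests concurrently beats issuing them one at a time:

```python
import asyncio
import time

async def fake_llm_call(prompt: str) -> str:
    # Stand-in for a network round trip to an LLM API (~0.1 s latency).
    await asyncio.sleep(0.1)
    return f"answer to: {prompt}"

async def serial(prompts):
    # Each request waits for the previous one: latencies add up.
    return [await fake_llm_call(p) for p in prompts]

async def parallel(prompts):
    # All requests are in flight at once: latencies overlap.
    return await asyncio.gather(*(fake_llm_call(p) for p in prompts))

prompts = [f"question {i}" for i in range(5)]

start = time.perf_counter()
asyncio.run(serial(prompts))
serial_s = time.perf_counter() - start

start = time.perf_counter()
answers = asyncio.run(parallel(prompts))
parallel_s = time.perf_counter() - start

print(f"serial:   {serial_s:.2f}s")
print(f"parallel: {parallel_s:.2f}s")
```

With five 0.1 s "requests", the serial version takes roughly 0.5 s while the parallel version takes roughly 0.1 s; the same overlapping applies to real API latency.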

Contributing

We welcome contributions to Polar Llama! If you're interested in improving the library or adding new features, please feel free to fork the repository and submit a pull request.

License

Polar Llama is released under the MIT license. For more details, see the LICENSE file in the repository.

Roadmap

  • Response Handling: Implement mechanisms to handle and process responses from parallel inference calls.
  • Support for Additional APIs: Extend support to other APIs for parallel inference processing.
  • Function Calling: Add support for using the function calls and structured data outputs for inference requests.
