Project description

README for Polar Llama

Overview

Polar Llama is a Python library designed to enhance the efficiency of making parallel inference calls to the ChatGPT API using the Polars dataframe tool. This library enables users to manage multiple API requests simultaneously, significantly speeding up the process compared to serial request handling.

Key Features

  • Parallel Inference: Send multiple inference requests in parallel to the ChatGPT API without waiting for each individual request to complete.
  • Integration with Polars: Utilizes the Polars dataframe for organizing and handling requests, leveraging its efficient data processing capabilities.
  • Easy to Use: Simplifies the process of sending queries and retrieving responses from the ChatGPT API through a clean and straightforward interface.
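The core idea behind the library, issuing many requests concurrently instead of one at a time, can be sketched with plain `asyncio`, independent of Polar Llama itself. The `fake_inference` coroutine below is a stand-in for a real API call; it just sleeps to simulate network latency:

```python
import asyncio
import time

async def fake_inference(prompt: str) -> str:
    # Stand-in for a network call to an LLM API; sleeps instead of calling out.
    await asyncio.sleep(0.1)
    return f"answer to: {prompt}"

async def serial(prompts):
    # One request at a time: total time is roughly n * latency.
    return [await fake_inference(p) for p in prompts]

async def parallel(prompts):
    # All requests in flight at once: total time is roughly 1 * latency.
    return await asyncio.gather(*(fake_inference(p) for p in prompts))

prompts = [f"question {i}" for i in range(10)]

start = time.perf_counter()
serial_answers = asyncio.run(serial(prompts))
serial_time = time.perf_counter() - start

start = time.perf_counter()
parallel_answers = asyncio.run(parallel(prompts))
parallel_time = time.perf_counter() - start

print(f"serial:   {serial_time:.2f}s")
print(f"parallel: {parallel_time:.2f}s")
```

With ten simulated 100 ms calls, the serial version takes about a second while the parallel version finishes in roughly the latency of a single call; Polar Llama applies the same pattern behind a Polars expression.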

Installation

To install Polar Llama, you will need to run the following command from a clone of the repository:

maturin develop

I will be making the package available on PyPI soon.

Example Usage

Here’s how you can use Polar Llama to send multiple inference requests in parallel:

import polars as pl
from polar_llama import string_to_message, inference_async, Provider
import dotenv

dotenv.load_dotenv()

# Example questions
questions = [
    'What is the capital of France?',
    'What is the difference between polars and pandas?'
]

# Creating a dataframe with questions
df = pl.DataFrame({'Questions': questions})

# Adding prompts to the dataframe
df = df.with_columns(
    prompt=string_to_message("Questions", message_type='user')
)

# Sending parallel inference requests
df = df.with_columns(
    answer=inference_async('prompt', provider=Provider.OPENAI, model='gpt-4o-mini')
)
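Presumably `string_to_message` wraps each raw string in a chat-style message for the provider; the snippet below is an illustrative plain-Python equivalent of that idea using the OpenAI chat-message shape (the function name and exact format here are assumptions, not the library's internals):

```python
import json

def to_user_message(text: str) -> str:
    # Illustrative only: wraps a raw string in an OpenAI-style chat message.
    return json.dumps({"role": "user", "content": text})

msg = to_user_message("What is the capital of France?")
print(msg)
```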

Benefits

  • Speed: Processes multiple queries in parallel, drastically reducing the time required for bulk query handling.
  • Scalability: Scales efficiently as the number of queries grows, making it suitable for high-demand applications.
  • Ease of Integration: Integrates seamlessly into existing Python projects that utilize Polars, making it easy to add parallel processing capabilities.

Contributing

We welcome contributions to Polar Llama! If you're interested in improving the library or adding new features, please feel free to fork the repository and submit a pull request.

License

Polar Llama is released under the MIT license. For more details, see the LICENSE file in the repository.

Roadmap

  • Response Handling: Implement mechanisms to handle and process responses from parallel inference calls.
  • Support for Additional APIs: Extend support to other APIs for parallel inference processing.
  • Function Calling: Add support for function calling and structured outputs in inference requests.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

polar_llama-0.1.4.tar.gz (168.6 kB)

Uploaded: Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

polar_llama-0.1.4-cp38-abi3-win_amd64.whl (6.2 MB)

Uploaded: CPython 3.8+, Windows x86-64

polar_llama-0.1.4-cp38-abi3-manylinux_2_34_x86_64.whl (7.6 MB)

Uploaded: CPython 3.8+, manylinux (glibc 2.34+), x86-64

polar_llama-0.1.4-cp38-abi3-macosx_11_0_arm64.whl (6.4 MB)

Uploaded: CPython 3.8+, macOS 11.0+, ARM64

polar_llama-0.1.4-cp38-abi3-macosx_10_12_x86_64.whl (6.7 MB)

Uploaded: CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file polar_llama-0.1.4.tar.gz.

File metadata

  • Download URL: polar_llama-0.1.4.tar.gz
  • Upload date:
  • Size: 168.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: maturin/1.8.2

File hashes

Hashes for polar_llama-0.1.4.tar.gz

  • SHA256: 96bc73466aadc2fdaf5fc2d88754d16cfb1a03ec9e86fd873f02f56823969f34
  • MD5: 2599ab1c0537504cfe133301cf17e93d
  • BLAKE2b-256: ebb6cd411e6e9a938ce9b572146c26089c8116f826031dee10baf9d577c84460

See more details on using hashes here.
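To check a downloaded file against one of the digests published above, the standard-library `hashlib` module is enough. The sketch below hashes a throwaway temporary file; for a real check, point `sha256_of` at the downloaded archive (e.g. polar_llama-0.1.4.tar.gz) and compare the result to the listed SHA256:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large archives need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a temporary file rather than the real sdist.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"polar llama")
    tmp = f.name

digest = sha256_of(tmp)
os.unlink(tmp)
print(digest)
```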

File details

Details for the file polar_llama-0.1.4-cp38-abi3-win_amd64.whl.

File metadata

File hashes

Hashes for polar_llama-0.1.4-cp38-abi3-win_amd64.whl

  • SHA256: db7b0eb36bb49fe22c41bdb71e9a30d07bf1f73011d72aa06dd6aab07d5ae814
  • MD5: 2d52853a42485a847ced023cb41e75e3
  • BLAKE2b-256: b79ca1acfc42a2fd816a751df06650e17806dbe3ebda63599775846133f6ef85

File details

Details for the file polar_llama-0.1.4-cp38-abi3-manylinux_2_34_x86_64.whl.

File metadata

File hashes

Hashes for polar_llama-0.1.4-cp38-abi3-manylinux_2_34_x86_64.whl

  • SHA256: c41740978508537f37f07b6cce6153c99dd6ce1e57ca7202b5750ba45a8ee446
  • MD5: 25cec2303e820202d5d1c2a1d43585f0
  • BLAKE2b-256: 640dfe3f5e0a11d53424dad4e2700fbdef3c183865b50a8e31acf4941ce8e8b4

File details

Details for the file polar_llama-0.1.4-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for polar_llama-0.1.4-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: f4e1177ea8809222c30e03f501eec23f30ce8eb6799f4fdce784c0150d623240
  • MD5: c3d31c90b33d9b8b86ca5fe71515f736
  • BLAKE2b-256: 3b0dc76f5a2baec6d098cfa25998c6bd862c21f91cbeaa279ee377e2f00fc687

File details

Details for the file polar_llama-0.1.4-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for polar_llama-0.1.4-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 172865882eaa0ad3eeb312564c8a40789d6051ea0ec0a42fdaf7b8fcbf823573
  • MD5: 9cb4a767fda56fe3833ecf168ed932ec
  • BLAKE2b-256: 8aedd7dd4e9b3022feb9fb9e01c875a7219ad52c7a3bc2ccadb795da67905982
