
A simple AI interface for Hugging Face's free hosted LLMs.


AILiteLLM


AILiteLLM is a lightweight Python wrapper around the Hugging Face Inference API that provides an OpenAI-compatible interface for various open-source language models, so developers already familiar with the OpenAI SDK can use powerful open models without learning a new API.


Features

  • OpenAI-compatible interface
  • Support for multiple Hugging Face models
  • Stream responses support
  • Function calling capabilities
  • Full typing support
  • Easy model switching

Installation

From PyPI:

pip install ailitellm

From source:

git clone https://github.com/yourusername/ailitellm.git
cd ailitellm
pip install -e .

Development installation:

pip install -e ".[dev]"

Quick Start

from ailitellm import ai, ailite_model

# Simple completion
response = ai("What is the capital of France?")
print(response.choices[0].message.content)

# Using a specific model
response = ai(
    "Explain quantum computing",
    model=ailite_model("Qwen/Qwen2.5-72B-Instruct")
)

Available Models

AILiteLLM supports the following models:

  • Qwen/Qwen2.5-72B-Instruct - Large general purpose model
  • Qwen/QwQ-32B-Preview - Preview version of QwQ model
  • Qwen/Qwen2.5-Coder-32B-Instruct - Specialized for coding tasks
  • NousResearch/Hermes-3-Llama-3.1-8B - Efficient general purpose model
  • microsoft/Phi-3.5-mini-instruct - Lightweight instruction-following model

Advanced Usage

Chat Completions

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Tell me a joke about programming."}
]

response = ai(messages)
print(response.choices[0].message.content)

Streaming Responses

for chunk in ai("Write a poem about AI", stream=True):
    print(chunk.choices[0].delta.content or "", end="")  # the final chunk's delta may be None

Function Calling

def get_weather(location: str, unit: str = "celsius") -> str:
    """Get the weather for a location."""
    return f"20 degrees {unit} in {location}"  # toy implementation for the example

response = ai(
    "What's the weather in London?",
    tools=[get_weather],
    tool_choice="auto"
)
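The docs above don't show how tool calls come back, but in OpenAI-compatible responses they appear under `response.choices[0].message.tool_calls`, each carrying a function name and JSON-encoded arguments. A self-contained sketch of dispatching such a call locally (the dict shape mirrors the OpenAI format; the weather function is a toy):

```python
import json

def get_weather(location: str, unit: str = "celsius") -> str:
    """Toy implementation used for dispatch."""
    return f"20 degrees {unit} in {location}"

# Registry mapping tool names to callables
TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call: dict) -> str:
    """Execute one tool call described by an OpenAI-style dict."""
    func = TOOLS[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])
    return func(**args)

# Shaped like an entry of response.choices[0].message.tool_calls
call = {"function": {"name": "get_weather",
                     "arguments": '{"location": "London"}'}}
print(dispatch_tool_call(call))  # 20 degrees celsius in London
```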

Advanced Parameters

response = ai(
    "Generate a creative story",
    temperature=0.8,
    max_tokens=500,
    top_p=0.9,
    presence_penalty=0.6
)

Custom Client

from ailitellm import AILite

custom_client = AILite(
    base_url="your_custom_endpoint",
    api_key="your_api_key"
)

Error Handling

try:
    response = ai("Your prompt here")
except Exception as e:
    print(f"An error occurred: {e}")
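Inference API calls can fail transiently (rate limits, cold model starts), so wrapping calls in a retry with exponential backoff is often useful. A minimal, library-agnostic sketch (`with_retries` is not part of ailitellm):

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * 2 ** attempt)

# Simulate a call that fails twice before succeeding
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # ok
```

In real use, `fn` would be a closure over `ai(...)`, e.g. `with_retries(lambda: ai("Your prompt here"))`.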

API Reference

Main Functions

ai(messages_or_prompt, **kwargs)

Main interface for generating completions.

Key parameters:

  • messages_or_prompt: List of messages or string prompt
  • model: Model to use (default: "Qwen/Qwen2.5-72B-Instruct")
  • temperature: Sampling temperature (default: 0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • tools: List of functions for tool calling
  • See source code for full list of parameters

ailite_model(model: HFModelType)

Helper function to specify model type with proper type checking.
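The definition of `HFModelType` isn't shown here, but a `typing.Literal` over the supported model ids is one plausible shape that would give static type checkers the described behavior; a sketch under that assumption:

```python
from typing import Literal

# Assumed shape of HFModelType: a Literal over the supported model ids,
# so type checkers can flag typos in model names at lint time.
HFModelType = Literal[
    "Qwen/Qwen2.5-72B-Instruct",
    "Qwen/QwQ-32B-Preview",
    "Qwen/Qwen2.5-Coder-32B-Instruct",
    "NousResearch/Hermes-3-Llama-3.1-8B",
    "microsoft/Phi-3.5-mini-instruct",
]

def ailite_model(model: HFModelType) -> str:
    """Return the model id unchanged; the annotation does the checking."""
    return model

print(ailite_model("Qwen/Qwen2.5-72B-Instruct"))
```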

Classes

AILite

Custom client class extending OpenAI's base client.

🧑‍💻 Development

To set up the development environment:

# Clone the repository
git clone https://github.com/yourusername/ailitellm.git
cd ailitellm

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest tests/

📦 Dependencies

  • Python >= 3.8
  • openai >= 1.0.0
  • httpx
  • typing-extensions

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. See CONTRIBUTING.md for guidelines.

📝 Citation

If you use AILiteLLM in your research, please cite:

@software{ailitellm2024,
  author = {Your Name},
  title = {AILiteLLM: OpenAI-compatible Interface for Hugging Face Models},
  year = {2024},
  publisher = {GitHub},
  url = {https://github.com/yourusername/ailitellm}
}

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • OpenAI for the API interface design
  • Hugging Face for model hosting and inference API
  • All model creators and contributors
