
A lightweight AI interface to Hugging Face's free hosted LLMs.


AILiteLLM


AILiteLLM is a lightweight Python wrapper around the Hugging Face Inference API that provides an OpenAI-compatible interface to various open-source language models, so developers already familiar with the OpenAI SDK can use them with minimal changes.


Features

  • OpenAI-compatible interface
  • Support for multiple Hugging Face models
  • Streaming response support
  • Function calling capabilities
  • Full typing support
  • Easy model switching

Installation

From PyPI:

pip install ailitellm

From source:

git clone https://github.com/yourusername/ailitellm.git
cd ailitellm
pip install -e .

Development installation:

pip install -e ".[dev]"

Quick Start

from ailitellm import ai, ailite_model

# Simple completion
response = ai("What is the capital of France?")
print(response.choices[0].message.content)

# Using a specific model
response = ai(
    "Explain quantum computing",
    model=ailite_model("Qwen/Qwen2.5-72B-Instruct")
)

Available Models

AILiteLLM supports the following models:

  • Qwen/Qwen2.5-72B-Instruct - Large general purpose model
  • Qwen/QwQ-32B-Preview - Preview version of QwQ model
  • Qwen/Qwen2.5-Coder-32B-Instruct - Specialized for coding tasks
  • NousResearch/Hermes-3-Llama-3.1-8B - Efficient general purpose model
  • microsoft/Phi-3.5-mini-instruct - Lightweight instruction-following model

Advanced Usage

Chat Completions

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Tell me a joke about programming."}
]

response = ai(messages)
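
Because messages is an ordinary list of role/content dicts, a multi-turn conversation is just a matter of appending each reply before the next call. A minimal sketch of the bookkeeping (the ai() calls themselves are elided, and the reply text is a placeholder; in real use it would be response.choices[0].message.content):

```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Tell me a joke about programming."},
]

# Record the assistant's answer so the model sees the full history
# on the next turn.  Placeholder text stands in for the real reply.
assistant_reply = "Why do programmers prefer dark mode? Because light attracts bugs."
messages.append({"role": "assistant", "content": assistant_reply})

# Ask a follow-up question in the same conversation.
messages.append({"role": "user", "content": "Now explain the joke."})
```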

Streaming Responses

for chunk in ai("Write a poem about AI", stream=True):
    # delta.content can be None on some chunks (e.g. the final one)
    print(chunk.choices[0].delta.content or "", end="")

Function Calling

def get_weather(location: str, unit: str = "celsius") -> str:
    """Get the weather for a location"""
    pass

response = ai(
    "What's the weather in London?",
    tools=[get_weather],
    tool_choice="auto"
)
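
OpenAI-compatible endpoints ultimately receive each tool as a JSON-schema description. If the automatic conversion from a Python function does not fit your case, you can construct the schema by hand; the layout below is the standard OpenAI tools format, which this wrapper presumably forwards unchanged:

```python
# Hand-written tool schema equivalent to the get_weather function above.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}
```

It would then be passed as `tools=[get_weather_tool]` instead of `tools=[get_weather]`.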

Advanced Parameters

response = ai(
    "Generate a creative story",
    temperature=0.8,
    max_tokens=500,
    top_p=0.9,
    presence_penalty=0.6
)

Custom Client

from ailitellm import AILite

custom_client = AILite(
    base_url="your_custom_endpoint",
    api_key="your_api_key"
)

Error Handling

try:
    response = ai("Your prompt here")
except Exception as e:
    print(f"An error occurred: {e}")
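
For transient failures such as rate limits or network hiccups, a simple retry with exponential backoff often suffices. A stdlib-only sketch (the specific exception types to catch depend on the underlying OpenAI SDK, so this catches Exception for illustration):

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 1.0):
    """Call fn(), retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the last error
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage: with_retries(lambda: ai("Your prompt here"))
```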

API Reference

Main Functions

ai(messages_or_prompt, **kwargs)

Main interface for generating completions.

Key parameters:

  • messages_or_prompt: List of messages or string prompt
  • model: Model to use (default: "Qwen/Qwen2.5-72B-Instruct")
  • temperature: Sampling temperature (default: 0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • tools: List of functions for tool calling
  • See source code for full list of parameters

ailite_model(model: HFModelType)

Helper function to specify model type with proper type checking.

Classes

AILite

Custom client class extending OpenAI's base client.

🧑‍💻 Development

To set up the development environment:

# Clone the repository
git clone https://github.com/yourusername/ailitellm.git
cd ailitellm

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest tests/

📦 Dependencies

  • Python >= 3.8
  • openai >= 1.0.0
  • httpx
  • typing-extensions

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. See CONTRIBUTING.md for guidelines.

📝 Citation

If you use AILiteLLM in your research, please cite:

@software{ailitellm2024,
  author = {Your Name},
  title = {AILiteLLM: OpenAI-compatible Interface for Hugging Face Models},
  year = {2024},
  publisher = {GitHub},
  url = {https://github.com/yourusername/ailitellm}
}

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • OpenAI for the API interface design
  • Hugging Face for model hosting and inference API
  • All model creators and contributors
