

Project description

Ailite

A lightweight Python interface for AI model interactions through Hugging Face's infrastructure.

Installation

pip install ailite

Usage

1. Initial Setup: Deploy a Server with serve()

Launch your own API server:

from ailite import serve

# Start server on http://0.0.0.0:11435
serve()
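
Once the server is up, any HTTP client can talk to it. Below is a minimal sketch using requests against the port shown above; the /v1/chat/completions path and the JSON payload shape are assumptions for illustration, not a documented contract, so check the server's actual routes before relying on them.

import requests

# Hypothetical call to the local ailite server started by serve().
# The endpoint path and payload fields are assumptions, not a documented API.
resp = requests.post(
    "http://localhost:11435/v1/chat/completions",
    json={"prompt": "Explain quantum computing"},
    timeout=60,
)
print(resp.status_code)
print(resp.text)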

2. Quick Start with ai()

The simplest way to get started:

from ailite import ai
response = ai("Explain quantum computing")
print(response)

3. Customization with ai()

from ailite import ai
response = ai(
    "Explain quantum computing",
    model="nvidia/Llama-3.1-Nemotron-70B-Instruct-HF",
    conversation=False
)

4. Streaming Responses with ai()

from ailite import ai
# With streaming
for chunk in ai(
    "Write a story about space",
    stream=True
):
    print(chunk, end="")
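
To keep the full text as well as the live stream, the chunks can be collected as they arrive; a small sketch, assuming each chunk is a plain string as in the loop above:

from ailite import ai

chunks = []
for chunk in ai("Write a story about space", stream=True):
    print(chunk, end="")   # stream to the console as it arrives
    chunks.append(chunk)   # keep every piece

full_text = "".join(chunks)  # the assembled response once streaming ends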

5. Client Usage with HUGPIClient

For more control over interactions:

from ailite import HUGPIClient

client = HUGPIClient(
    api_key="your_email@gmail.com_your_password",  # Hugging Face email and password, joined by an underscore
    model="nvidia/Llama-3.1-Nemotron-70B-Instruct-HF",
    system_prompt="You are a helpful assistant..."
)

# Generate text
response = client.messages.create(
    prompt="What is the theory of relativity?",
    conversation=True
)
print(response.content[0]["text"])

# Chat conversation
messages = [
    {"role": "user", "content": "Hi, how are you?"},
    {"role": "assistant", "content": "I'm doing well, how can I help?"},
    {"role": "user", "content": "Tell me about AI"}
]
response = client.messages.create(messages=messages)
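
Assuming the multi-turn response has the same shape as the single-prompt response above (a content list of text blocks), the reply can be read the same way and fed back in to continue the exchange:

# Assumes the response shape matches the single-prompt example above.
reply = response.content[0]["text"]
print(reply)

# Append the reply and a new user turn, then call create() again to continue.
messages.append({"role": "assistant", "content": reply})
messages.append({"role": "user", "content": "Can you give a concrete everyday example?"})
response = client.messages.create(messages=messages)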

6. Base Model with HUGPiLLM

For direct model interactions:

from ailite import HUGPiLLM

llm = HUGPiLLM(
    hf_email="your_email@gmail.com",
    hf_password="your_password",
    default_llm=3,  # Model index
    system_prompt="Custom system instructions here"
)

response = llm.generate("Explain machine learning")
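
For follow-up prompts you can simply call generate() again. Whether HUGPiLLM carries conversation state between calls isn't documented here, so this sketch treats each call as independent and restates the needed context:

# Follow-up call; conversation memory between generate() calls is not guaranteed,
# so restate any context the model needs.
follow_up = llm.generate(
    "Building on that explanation of machine learning, give one example of supervised learning"
)
print(follow_up)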

Dependencies

fastapi>=0.68.0
pydantic>=1.8.0
uvicorn>=0.15.0
requests>=2.26.0

License

MIT License - see LICENSE file for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ailite-2.0.0.tar.gz (32.6 kB)

Built Distribution

ailite-2.0.0-py3-none-any.whl (40.6 kB)

File details

Details for the file ailite-2.0.0.tar.gz.

File metadata

  • Download URL: ailite-2.0.0.tar.gz
  • Size: 32.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for ailite-2.0.0.tar.gz

  • SHA256: ca10ad966207ec12de76fdaf5113637a249088fa249bd983bed0d7af0b5ae7d3
  • MD5: 2a84fceada9942bacf8bc5d325e7e533
  • BLAKE2b-256: 8ff393649a7a31be6be83fc9ec1a43613c6001f5f7e4cec77e12485e1a96bf74


File details

Details for the file ailite-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: ailite-2.0.0-py3-none-any.whl
  • Size: 40.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for ailite-2.0.0-py3-none-any.whl

  • SHA256: 3bb675ad579c5843cf4d187899aa01526924cf0a3bd350a6b72c6e84c4b425c3
  • MD5: ebb963132fd5c744569813b655708c0b
  • BLAKE2b-256: d0ba42775547cb5967317f00e4f1c7a765e82476c1f865097cc88eda3104cfec

