
A powerful web content fetcher and processor

Project description

Ailite

A lightweight Python interface for AI model interactions through Hugging Face's infrastructure.

Installation

pip install ailite

Usage

1. Server Setup with serve()

Launch your own API server:

from ailite import serve

# Start server on http://0.0.0.0:11435
serve()
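
If you want the server and your client code in the same script, one option is to run the server in a background thread. A minimal sketch, assuming serve() blocks the calling process until shutdown (as uvicorn-based servers typically do):

from threading import Thread
from ailite import serve

# Run the server in a daemon thread so the rest of the script can continue
# (assumes serve() blocks until shutdown).
Thread(target=serve, daemon=True).start()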

2. Quick Start with ai()

The simplest way to get started:

from ailite import ai
response = ai("Explain quantum computing")
print(response)
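
Because ai() takes a plain string and returns the reply, it drops straight into a simple command-line loop. A minimal sketch using only the call shown above:

from ailite import ai

# Ask repeatedly until an empty line is entered.
while True:
    prompt = input("You: ")
    if not prompt:
        break
    print("AI:", ai(prompt))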

3. Customization with ai()

from ailite import ai
response = ai(
    "Explain quantum computing",
    model="nvidia/Llama-3.1-Nemotron-70B-Instruct-HF",
    conversation=False
)

4. Streaming Responses with ai()

from ailite import ai
# With streaming
for chunk in ai(
    "Write a story about space",
    stream=True
):
    print(chunk, end="")
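
If you also want the complete text once the stream finishes, accumulate the chunks as you print them. A sketch assuming each chunk is a string, as the loop above implies:

from ailite import ai

chunks = []
for chunk in ai("Write a story about space", stream=True):
    print(chunk, end="", flush=True)  # show text as it arrives
    chunks.append(chunk)
full_text = "".join(chunks)  # complete story after the stream ends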

5. Client Usage with HUGPIClient

For more control over interactions:

from ailite import HUGPIClient

client = HUGPIClient(
    api_key="your_email@gmail.com_your_password",
    model="nvidia/Llama-3.1-Nemotron-70B-Instruct-HF",
    system_prompt="You are a helpful assistant..."
)

# Generate text
response = client.messages.create(
    prompt="What is the theory of relativity?",
    conversation=True
)
print(response.content[0]["text"])

# Chat conversation
messages = [
    {"role": "user", "content": "Hi, how are you?"},
    {"role": "assistant", "content": "I'm doing well, how can I help?"},
    {"role": "user", "content": "Tell me about AI"}
]
response = client.messages.create(messages=messages)
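
To continue a multi-turn exchange, append the assistant's reply and the next user message before calling create() again. A sketch assuming replies expose their text via response.content[0]["text"], as in the example above (the follow-up question is illustrative):

# Append the reply and the next user turn, then ask again.
messages.append({"role": "assistant", "content": response.content[0]["text"]})
messages.append({"role": "user", "content": "How is AI used in healthcare?"})
response = client.messages.create(messages=messages)
print(response.content[0]["text"])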

6. Base Model with HUGPiLLM

For direct model interactions:

from ailite import HUGPiLLM

llm = HUGPiLLM(
    hf_email="your_email@gmail.com",
    hf_password="your_password",
    default_llm=3,  # Model index
    system_prompt="Custom system instructions here"
)

response = llm.generate("Explain machine learning")
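
The same HUGPiLLM instance can be reused across prompts. A sketch assuming generate() returns the reply text, as in the single call above:

# Reuse one HUGPiLLM instance for several prompts.
prompts = [
    "Explain machine learning",
    "Explain deep learning",
    "Explain reinforcement learning",
]
for prompt in prompts:
    print(llm.generate(prompt))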

Dependencies

fastapi>=0.68.0
pydantic>=1.8.0
uvicorn>=0.15.0
requests>=2.26.0

License

MIT License - see LICENSE file for details.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ailite-5.0.2.tar.gz (35.7 kB)

Uploaded Source

Built Distribution

ailite-5.0.2-py3-none-any.whl (44.5 kB)

Uploaded Python 3

File details

Details for the file ailite-5.0.2.tar.gz.

File metadata

  • Download URL: ailite-5.0.2.tar.gz
  • Upload date:
  • Size: 35.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for ailite-5.0.2.tar.gz
Algorithm Hash digest
SHA256 87498ba3dbd41cc989920e486ebb4bc4279a9eca9d2e7ed5322598d7b4758b05
MD5 b821889dcd3eeaffe3634bf5d535cfa6
BLAKE2b-256 b696b7949001390305db430cd72062312d20e70852463de98fbb430533f84c7f

See more details on using hashes here.

File details

Details for the file ailite-5.0.2-py3-none-any.whl.

File metadata

  • Download URL: ailite-5.0.2-py3-none-any.whl
  • Upload date:
  • Size: 44.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for ailite-5.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 998c8f9bb8123eb0e3ae097a05e6d2261ae810f79f9c39dd8c1eb89126979bf9
MD5 6edb3dadf6eeb89a3190a341a79179b5
BLAKE2b-256 c30b227ebc46a26ccb2707021bcf652138a30292375d5cbb296ac3228ae2ddb5

See more details on using hashes here.
