Python SDK for Lexa AI - OpenAI-compatible interface for Lexa's language models

Project description

Lexa Python SDK

Python 3.8+ · MIT License

A Python SDK for Lexa AI that provides an OpenAI-compatible interface for easy integration with Lexa's language models. Built with automatic SSL configuration and zero-setup installation.

✨ Features

  • 🔗 OpenAI-Compatible: Drop-in replacement for OpenAI SDK
  • 🚀 Async Support: Full async/await support for high-performance applications
  • 📦 Type Safety: Comprehensive type hints and validation with Pydantic
  • 🔄 Streaming: Real-time streaming responses for interactive applications
  • 🛡️ Auto SSL: Automatic SSL certificate handling - works out of the box
  • 📊 Multiple Models: Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)
  • 🔧 Flexible Configuration: Optional SSL and configuration overrides
  • High Performance: Optimized HTTP clients with connection pooling

📦 Installation

pip install lexa

🚀 Quick Start

from lexa_sdk import Lexa

# Initialize the client with your API key
client = Lexa(api_key="your-api-key")

# Simple chat completion
response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])

📚 Available Models

Model      Description                                  Context Window   Max Tokens   Use Case
lexa-mml   Multimodal model with vision capabilities    8,192            4,096        General purpose with image understanding
lexa-x1    Fast, lightweight text-based model           4,096            2,048        Quick responses, simple tasks
lexa-rho   Reasoning model with enhanced capabilities   16,384           8,192        Complex reasoning, analysis
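The table above suggests a simple routing rule: send long or complex prompts to the larger models and quick tasks to lexa-x1. Here is a hypothetical helper that encodes that idea; the model names and context windows come from the table, but the selection rule itself is only an illustration, not part of the SDK:

```python
# Context windows taken from the model table above.
MODEL_SPECS = {
    "lexa-mml": {"context_window": 8192, "max_tokens": 4096},
    "lexa-x1": {"context_window": 4096, "max_tokens": 2048},
    "lexa-rho": {"context_window": 16384, "max_tokens": 8192},
}

def pick_model(prompt_tokens, needs_reasoning=False):
    """Pick the smallest model whose context window comfortably fits the prompt.

    Illustrative routing only -- the SDK does not route for you.
    """
    if needs_reasoning:
        return "lexa-rho"
    # Leave roughly half the window free for the completion.
    if prompt_tokens <= MODEL_SPECS["lexa-x1"]["context_window"] // 2:
        return "lexa-x1"
    return "lexa-mml"
```

You could then pass the result straight to `client.chat.completions.create(model=pick_model(...), ...)`.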

🔧 Advanced Usage

Async Support

import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Async chat completion
    response = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Explain quantum computing"}],
        temperature=0.3
    )

    print(response["choices"][0]["message"]["content"])

asyncio.run(main())

Streaming Responses

from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Streaming chat completion
stream = client.chat.completions.create(
    model="lexa-mml",
    messages=[{"role": "user", "content": "Write a short story"}],
    temperature=0.8,
    stream=True
)

for chunk in stream:
    if chunk["choices"][0]["delta"].get("content"):
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
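The chunk shape in the loop above mirrors OpenAI's streaming format: each chunk carries a partial `delta`, and the first chunk typically holds only the role. Assuming that shape, a small helper can reassemble the full text from a stream (the example chunks below are illustrative, not real API output):

```python
def accumulate_stream(chunks):
    """Join the content deltas from a stream of OpenAI-style chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        if content:  # role-only or empty deltas carry no text
            parts.append(content)
    return "".join(parts)

# Illustrative chunks in the shape the loop above consumes:
fake_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Once upon"}}]},
    {"choices": [{"delta": {"content": " a time"}}]},
]
print(accumulate_stream(fake_chunks))  # Once upon a time
```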

Custom SSL Configuration

from lexa_sdk import Lexa

# For environments with SSL issues (not recommended for production)
client = Lexa(
    api_key="your-api-key",
    verify_ssl=False  # ⚠️  Only use if necessary
)

# Or use enhanced SSL (default behavior)
client = Lexa(
    api_key="your-api-key",
    enhanced_ssl=True  # Automatically download and use correct certificates
)

🛠️ API Reference

Client Methods

  • client.chat.completions.create() - Create chat completion
  • client.chat.completions.acreate() - Async chat completion
  • client.models.list() - List available models
  • client.models.alist() - Async list models
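Since the interface is OpenAI-compatible, `client.models.list()` presumably returns the familiar `{"data": [...]}` envelope. A hedged sketch for pulling out the model ids — the response shape here is an assumption, not something this page confirms:

```python
def model_ids(models_response):
    """Extract model ids from an OpenAI-style models list response."""
    return [m["id"] for m in models_response.get("data", [])]

# Illustrative response in the assumed shape:
example = {"data": [{"id": "lexa-mml"}, {"id": "lexa-x1"}, {"id": "lexa-rho"}]}
```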

Parameters

  • model: Model to use (required)
  • messages: List of messages (required)
  • temperature: Sampling temperature (0.0 to 2.0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • top_p: Nucleus sampling parameter
  • frequency_penalty: Frequency penalty
  • presence_penalty: Presence penalty
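The ranges above can be sanity-checked client-side before making a request. The following is our own sketch, not validation the SDK performs; the temperature range comes from the list above, while the 0.0–1.0 bound for `top_p` is the usual convention and is assumed here:

```python
def validate_params(temperature=None, top_p=None, max_tokens=None):
    """Return a kwargs dict containing only the parameters that were set."""
    if temperature is not None and not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0.0 and 2.0")
    if top_p is not None and not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
    if max_tokens is not None and max_tokens < 1:
        raise ValueError("max_tokens must be a positive integer")
    params = {"temperature": temperature, "top_p": top_p, "max_tokens": max_tokens}
    return {k: v for k, v in params.items() if v is not None}
```

The returned dict can be splatted into a call: `client.chat.completions.create(model="lexa-x1", messages=msgs, **validate_params(temperature=0.7))`.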

🔒 Security & SSL

The Lexa SDK automatically handles SSL certificate verification:

  • Default: Uses enhanced SSL with automatic certificate management
  • Fallback: Gracefully falls back to standard SSL verification
  • Manual Override: Allows custom SSL configuration when needed
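The fallback order described above can be pictured with the standard library's `ssl` module. This is only our illustration of the behavior, not the SDK's actual internals; the `certifi` package stands in for the "enhanced" certificate source:

```python
import ssl

def make_ssl_context(verify=True):
    """Illustrative fallback: enhanced (certifi) -> standard -> disabled."""
    if not verify:
        # Mirrors verify_ssl=False above -- not recommended for production.
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        return ctx
    try:
        import certifi  # enhanced: a bundled, regularly updated CA store
        return ssl.create_default_context(cafile=certifi.where())
    except ImportError:
        # Graceful fallback to the system's standard verification.
        return ssl.create_default_context()
```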

📖 Documentation

For complete documentation, examples, and API reference, visit:

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Built with ❤️ by Robi Labs
  • Compatible with OpenAI API specifications
  • Powered by Lexa's advanced AI models

Project details


Download files

Download the file for your platform.

Source Distribution

lexa-1.0.2.tar.gz (52.7 kB)

Uploaded Source

Built Distribution

lexa-1.0.2-py3-none-any.whl (24.3 kB)

Uploaded Python 3

File details

Details for the file lexa-1.0.2.tar.gz.

File metadata

  • Download URL: lexa-1.0.2.tar.gz
  • Upload date:
  • Size: 52.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.2.tar.gz
Algorithm Hash digest
SHA256 e39081a60268833eecdf9c8a475af6dec720308044f0833630c3a0be09d349b5
MD5 d4efbf9af50dcc713e6003a7dc350c7b
BLAKE2b-256 2f0ee35498afb1ee6cdadab4f694594d9455be726176499adc9a4407169afc56

File details

Details for the file lexa-1.0.2-py3-none-any.whl.

File metadata

  • Download URL: lexa-1.0.2-py3-none-any.whl
  • Upload date:
  • Size: 24.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 e92e6418b43767a8bb79012e2686acf319fb1c89ffc118f621cb382cbdacf898
MD5 dc85ae1e9ab49a4b8d00353e7a36960e
BLAKE2b-256 cc8a9015369ab9b0524c10016c4f70ec0bd503cd80994e780b62e4b8aa389b8e
