
Python SDK for Lexa AI - OpenAI-compatible interface for Lexa's language models

Project description

Lexa Python SDK

PyPI version · Python 3.8+ · License: MIT

A Python SDK for Lexa AI that provides an OpenAI-compatible interface for easy integration with Lexa's language models. Built with automatic SSL configuration and zero-setup installation.

✨ Features

  • 🔗 OpenAI-Compatible: Drop-in replacement for OpenAI SDK
  • 🚀 Async Support: Full async/await support for high-performance applications
  • 📦 Type Safety: Comprehensive type hints and validation with Pydantic
  • 🔄 Streaming: Real-time streaming responses for interactive applications
  • 🛡️ Auto SSL: Automatic SSL certificate handling - works out of the box
  • 📊 Multiple Models: Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)
  • 🔧 Flexible Configuration: Optional SSL and configuration overrides
  • ⚡ High Performance: Optimized HTTP clients with connection pooling

📦 Installation

pip install lexa

🚀 Quick Start

from lexa_sdk import Lexa

# Initialize the client with your API key
client = Lexa(api_key="your-api-key")

# Simple chat completion
response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])

📚 Available Models

| Model | Description | Context Window | Max Tokens | Use Case |
|---|---|---|---|---|
| lexa-mml | Multimodal model with vision capabilities | 8,192 | 4,096 | General purpose with image understanding |
| lexa-x1 | Fast, lightweight text-based model | 4,096 | 2,048 | Quick responses, simple tasks |
| lexa-rho | Reasoning model with enhanced capabilities | 16,384 | 8,192 | Complex reasoning, analysis |
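Choosing between these models usually comes down to whether the task needs vision or heavy reasoning. A minimal sketch of that decision, using the names and limits from the table above (the `pick_model` helper itself is an illustration, not part of the SDK):

```python
# Context/token limits copied from the models table above.
MODEL_LIMITS = {
    "lexa-mml": {"context_window": 8192, "max_tokens": 4096},
    "lexa-x1": {"context_window": 4096, "max_tokens": 2048},
    "lexa-rho": {"context_window": 16384, "max_tokens": 8192},
}

def pick_model(needs_vision: bool = False, needs_reasoning: bool = False) -> str:
    """Suggest a model: vision -> lexa-mml, complex reasoning -> lexa-rho,
    otherwise the fast, lightweight lexa-x1."""
    if needs_vision:
        return "lexa-mml"
    if needs_reasoning:
        return "lexa-rho"
    return "lexa-x1"
```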

🔧 Advanced Usage

Async Support

import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Async chat completion
    response = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Explain quantum computing"}],
        temperature=0.3
    )

    print(response["choices"][0]["message"]["content"])

asyncio.run(main())
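Because `acreate()` is awaitable, several completions can run concurrently with `asyncio.gather`. The helper below is a sketch, not part of the SDK; it works with any client object exposing the `acreate()` method shown above:

```python
import asyncio

async def gather_completions(client, prompts, model="lexa-mml"):
    """Run one chat completion per prompt concurrently.

    `client` is any object with a chat.completions.acreate() coroutine,
    such as the Lexa client used in the async example above.
    """
    tasks = [
        client.chat.completions.acreate(
            model=model,
            messages=[{"role": "user", "content": p}],
        )
        for p in prompts
    ]
    # gather() preserves order: results[i] corresponds to prompts[i].
    return await asyncio.gather(*tasks)
```

Usage: `responses = asyncio.run(gather_completions(client, ["Q1", "Q2", "Q3"]))`.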

Streaming Responses

from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Streaming chat completion
stream = client.chat.completions.create(
    model="lexa-mml",
    messages=[{"role": "user", "content": "Write a short story"}],
    temperature=0.8,
    stream=True
)

for chunk in stream:
    if chunk["choices"][0]["delta"].get("content"):
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
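When the full text is needed after streaming finishes (for logging or post-processing), the per-chunk deltas can be accumulated into one string. A small helper for that, assuming only the chunk shape used in the loop above:

```python
def collect_stream(stream) -> str:
    """Accumulate text content from OpenAI-style streaming chunks.

    Accepts any iterable of chunk dicts shaped like the ones in the
    streaming loop above and returns the concatenated text.
    """
    parts = []
    for chunk in stream:
        delta = chunk["choices"][0]["delta"]
        if delta.get("content"):
            parts.append(delta["content"])
    return "".join(parts)
```

Usage: `story = collect_stream(stream)` after creating the stream with `stream=True`.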

Custom SSL Configuration

from lexa_sdk import Lexa

# For environments with SSL issues (not recommended for production)
client = Lexa(
    api_key="your-api-key",
    verify_ssl=False  # ⚠️  Only use if necessary
)

# Or use enhanced SSL (default behavior)
client = Lexa(
    api_key="your-api-key",
    enhanced_ssl=True  # Automatically download and use correct certificates
)

🛠️ API Reference

Client Methods

  • client.chat.completions.create() - Create chat completion
  • client.chat.completions.acreate() - Async chat completion
  • client.models.list() - List available models
  • client.models.alist() - Async list models

Parameters

  • model: Model to use (required)
  • messages: List of messages (required)
  • temperature: Sampling temperature (0.0 to 2.0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • top_p: Nucleus sampling parameter
  • frequency_penalty: Penalizes tokens in proportion to how often they have already appeared, reducing repetition
  • presence_penalty: Penalizes tokens that have appeared at all, encouraging the model to cover new topics
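Validating these parameters before sending a request makes range errors fail fast on the client side. A sketch of such a builder; the temperature range (0.0 to 2.0) comes from the list above, while the `top_p` range (0.0 to 1.0) is assumed from OpenAI conventions:

```python
def build_request(model, messages, temperature=None, max_tokens=None,
                  stream=False, top_p=None):
    """Build a chat-completion payload, validating parameter ranges.

    Only the two required fields (model, messages) and stream are always
    included; optional parameters are added when provided.
    """
    if not model:
        raise ValueError("model is required")
    if not messages:
        raise ValueError("messages is required")
    payload = {"model": model, "messages": messages, "stream": stream}
    if temperature is not None:
        if not 0.0 <= temperature <= 2.0:
            raise ValueError("temperature must be between 0.0 and 2.0")
        payload["temperature"] = temperature
    if top_p is not None:
        if not 0.0 <= top_p <= 1.0:
            raise ValueError("top_p must be between 0.0 and 1.0")
        payload["top_p"] = top_p
    if max_tokens is not None:
        if max_tokens <= 0:
            raise ValueError("max_tokens must be positive")
        payload["max_tokens"] = max_tokens
    return payload
```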

🔒 Security & SSL

The Lexa SDK automatically handles SSL certificate verification:

  • Default: Uses enhanced SSL with automatic certificate management
  • Fallback: Gracefully falls back to standard SSL verification
  • Manual Override: Allows custom SSL configuration when needed

📖 Documentation

For complete documentation, examples, and API reference, visit:

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Built with ❤️ by Robi Labs
  • Compatible with OpenAI API specifications
  • Powered by Lexa's advanced AI models

Project details


Download files

Download the file for your platform.

Source Distribution

lexa-1.0.1.tar.gz (52.7 kB)

Uploaded Source

Built Distribution


lexa-1.0.1-py3-none-any.whl (24.3 kB)

Uploaded Python 3

File details

Details for the file lexa-1.0.1.tar.gz.

File metadata

  • Download URL: lexa-1.0.1.tar.gz
  • Size: 52.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.1.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | aaf5a61296e66370aee3db399a7c62b610b640750d5ae64a0d6bcd71e58e955c |
| MD5 | 68444556d15fdb8a80aa2a24cd4d8938 |
| BLAKE2b-256 | 63c6d5f39c464c4051db8918768449195e06892969d71486409434387a4b7c65 |


File details

Details for the file lexa-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: lexa-1.0.1-py3-none-any.whl
  • Size: 24.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.1-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0138fc330b3e8bd69c3e3c1b06bad7a48af604e2c67fd7796f6c04551135a581 |
| MD5 | f0ac98649cdb54e07ea620c97f3b5128 |
| BLAKE2b-256 | 17227c6daa7dc60976dea5addc2d535ec15916558c33b72c0a7b60d523ada407 |

