
Python SDK for Lexa AI - OpenAI-compatible interface for Lexa's language models

Project description

Lexa Python SDK

Python 3.8+ · License: MIT

A Python SDK for Lexa AI that provides an OpenAI-compatible interface for easy integration with Lexa's language models. It ships with automatic SSL configuration and installs with zero setup.

✨ Features

  • 🔗 OpenAI-Compatible: Drop-in replacement for the OpenAI SDK
  • 🚀 Async Support: Full async/await support for high-performance applications
  • 📦 Type Safety: Comprehensive type hints and validation with Pydantic
  • 🔄 Streaming: Real-time streaming responses for interactive applications
  • 🛡️ Auto SSL: Automatic SSL certificate handling - works out of the box
  • 📊 Multiple Models: Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)
  • 🔧 Flexible Configuration: Optional SSL and configuration overrides
  • ⚡ High Performance: Optimized HTTP clients with connection pooling

📦 Installation

pip install lexa

🚀 Quick Start

from lexa_sdk import Lexa

# Initialize the client with your API key
client = Lexa(api_key="your-api-key")

# Simple chat completion
response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])

📚 Available Models

| Model    | Description                                | Context Window | Max Tokens | Use Case                                  |
|----------|--------------------------------------------|----------------|------------|-------------------------------------------|
| lexa-mml | Multimodal model with vision capabilities  | 8,192          | 4,096      | General purpose with image understanding  |
| lexa-x1  | Fast, lightweight text-based model         | 4,096          | 2,048      | Quick responses, simple tasks             |
| lexa-rho | Reasoning model with enhanced capabilities | 16,384         | 8,192      | Complex reasoning, analysis               |
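The context windows in the table above suggest a simple routing rule: use the smallest model whose window fits the prompt. A minimal sketch of that idea, where the helper name and the token-count heuristic are illustrative and not part of the SDK:

```python
# Context windows taken from the model table above.
CONTEXT_WINDOWS = {
    "lexa-x1": 4_096,
    "lexa-mml": 8_192,
    "lexa-rho": 16_384,
}

def pick_model(prompt_tokens: int) -> str:
    """Return the smallest model whose context window fits the prompt."""
    for model, window in sorted(CONTEXT_WINDOWS.items(), key=lambda kv: kv[1]):
        if prompt_tokens <= window:
            return model
    raise ValueError(f"prompt of {prompt_tokens} tokens exceeds all context windows")

print(pick_model(3_000))   # → lexa-x1
print(pick_model(10_000))  # → lexa-rho
```

In practice you would also reserve room for the completion (max_tokens) when comparing against the window.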

🔧 Advanced Usage

Async Support

import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Async chat completion
    response = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Explain quantum computing"}],
        temperature=0.3
    )

    print(response["choices"][0]["message"]["content"])

asyncio.run(main())

Streaming Responses

from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Streaming chat completion
stream = client.chat.completions.create(
    model="lexa-mml",
    messages=[{"role": "user", "content": "Write a short story"}],
    temperature=0.8,
    stream=True
)

for chunk in stream:
    if chunk["choices"][0]["delta"].get("content"):
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
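When you need the complete reply after streaming finishes, the delta chunks can be joined into one string. A small helper sketch, assuming the chunk shape used in the loop above (`collect_stream` is illustrative, not an SDK function):

```python
def collect_stream(chunks) -> str:
    """Join the content fields of streamed delta chunks into one string."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        if delta.get("content"):
            parts.append(delta["content"])
    return "".join(parts)

# Works on any iterable of chunk dicts, e.g. a recorded stream:
fake_stream = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello, "}}]},
    {"choices": [{"delta": {"content": "world!"}}]},
]
print(collect_stream(fake_stream))  # → Hello, world!
```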

Custom SSL Configuration

from lexa_sdk import Lexa

# For environments with SSL issues (not recommended for production)
client = Lexa(
    api_key="your-api-key",
    verify_ssl=False  # ⚠️  Only use if necessary
)

# Or use enhanced SSL (default behavior)
client = Lexa(
    api_key="your-api-key",
    enhanced_ssl=True  # Automatically download and use correct certificates
)

🛠️ API Reference

Client Methods

  • client.chat.completions.create() - Create chat completion
  • client.chat.completions.acreate() - Async chat completion
  • client.models.list() - List available models
  • client.models.alist() - Async list models

Parameters

  • model: Model to use (required)
  • messages: List of messages (required)
  • temperature: Sampling temperature (0.0 to 2.0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • top_p: Nucleus sampling parameter
  • frequency_penalty: Frequency penalty
  • presence_penalty: Presence penalty
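The ranges above can be checked client-side before making a request. A hedged sketch of such validation: the 0.0–2.0 range for temperature comes from the parameter list above, while the top_p and max_tokens bounds follow common OpenAI-compatible conventions and are assumptions here:

```python
def validate_params(temperature=1.0, top_p=1.0, max_tokens=None):
    """Validate sampling parameters before sending a request (illustrative helper)."""
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0.0 and 2.0")
    if not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
    if max_tokens is not None and max_tokens < 1:
        raise ValueError("max_tokens must be a positive integer")
    return {"temperature": temperature, "top_p": top_p, "max_tokens": max_tokens}

print(validate_params(temperature=0.7, max_tokens=100))
```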

🔒 Security & SSL

The Lexa SDK automatically handles SSL certificate verification:

  • Default: Uses enhanced SSL with automatic certificate management
  • Fallback: Gracefully falls back to standard SSL verification
  • Manual Override: Allows custom SSL configuration when needed
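Using the standard library, the verify/no-verify split described above can be sketched as follows. The function name is illustrative; the SDK handles this internally, and disabling verification should only ever be a debugging measure:

```python
import ssl

def build_ssl_context(verify: bool = True) -> ssl.SSLContext:
    """Return a default-verifying SSL context, or an unverified one on request."""
    ctx = ssl.create_default_context()
    if not verify:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # ⚠️ only for debugging, never production
    return ctx

ctx = build_ssl_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # → True
```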

📖 Documentation

For complete documentation, examples, and API reference, visit:

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Built with ❤️ by Robi Labs
  • Compatible with OpenAI API specifications
  • Powered by Lexa's advanced AI models

Project details


Download files

Download the file for your platform.

Source Distribution

lexa-1.0.3.tar.gz (52.9 kB)

Uploaded Source

Built Distribution


lexa-1.0.3-py3-none-any.whl (24.4 kB)

Uploaded Python 3

File details

Details for the file lexa-1.0.3.tar.gz.

File metadata

  • Download URL: lexa-1.0.3.tar.gz
  • Upload date:
  • Size: 52.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.3.tar.gz

  • SHA256: 56a357e4581c17da8e252adcb7171ee1583e17ef07dc6e442c7a1cb0f0c9a589
  • MD5: 0c84577e6e7ea589f17144f3a919da8d
  • BLAKE2b-256: 8a114d047f880fa054b5e7ac58a2cff5072f22cb7f1f35b79d6650b1af75c92a


File details

Details for the file lexa-1.0.3-py3-none-any.whl.

File metadata

  • Download URL: lexa-1.0.3-py3-none-any.whl
  • Upload date:
  • Size: 24.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for lexa-1.0.3-py3-none-any.whl

  • SHA256: 01d89e741eeabd9b6559ba6dac45697e5d8c80ba958983ecad49eb4d51caf38c
  • MD5: 4003d1f6498f1cbb0ec1c8d55f0fa35d
  • BLAKE2b-256: e80bac78adc3a9d78b202489a14e1c845caa462d6896122ff92b915752c7d0fc

