
Project description

LLMstudio by TensorOps

Prompt Engineering at your fingertips

LLMstudio logo

🌟 Features

LLMstudio UI

  • LLM Proxy Access: Seamless access to the latest LLMs from OpenAI, Anthropic, and Google.
  • Custom and Local LLM Support: Use custom or local open-source LLMs through Ollama.
  • Prompt Playground UI: A user-friendly interface for engineering and fine-tuning your prompts.
  • Python SDK: Easily integrate LLMstudio into your existing workflows.
  • Monitoring and Logging: Keep track of your usage and performance for all requests.
  • LangChain Integration: LLMstudio integrates with your already existing LangChain projects.
  • Batch Calling: Send multiple requests at once for improved efficiency.
  • Smart Routing and Fallback: Ensure 24/7 availability by routing your requests to trusted LLMs.
  • Type Casting (soon): Convert data types as needed for your specific use case.
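The smart-routing-and-fallback idea above can be sketched generically: try providers in priority order and fall back to the next one on failure. This is an illustrative sketch with stub providers, not LLMstudio's actual routing code, which also weighs latency, cost, and availability.

```python
from typing import Callable, List

def call_with_fallback(providers: List[Callable[[str], str]], prompt: str) -> str:
    """Try each provider in order; return the first successful response.

    Illustrative only -- LLMstudio's real router is more sophisticated.
    """
    errors = []
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as exc:  # a real router would catch provider-specific errors
            errors.append(exc)
    raise RuntimeError(f"All providers failed: {errors}")

# Stub providers standing in for real LLM backends.
def flaky_provider(prompt: str) -> str:
    raise TimeoutError("provider unavailable")

def stable_provider(prompt: str) -> str:
    return f"echo: {prompt}"

print(call_with_fallback([flaky_provider, stable_provider], "hello"))
```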

🚀 Quickstart

Don't forget to check out the documentation at https://docs.llmstudio.ai.

Installation

Install the latest version of LLMstudio using pip. We suggest you create and activate a new environment using conda first:

pip install llmstudio

Install bun if you want to use the UI:

curl -fsSL https://bun.sh/install | bash

Create a .env file in the directory from which you'll run LLMstudio:

OPENAI_API_KEY="sk-api_key"
ANTHROPIC_API_KEY="sk-api_key"
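The proxy reads these keys from environment variables. Real projects typically load a .env file with a library such as python-dotenv; the hand-rolled parser below is only a sketch of what that loading amounts to, and it ignores quoting and escaping edge cases.

```python
import os

def load_env(path: str) -> None:
    """Load KEY="value" lines from a .env file into os.environ.

    Minimal illustration -- prefer python-dotenv in practice.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blanks, comments, and malformed lines.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')

# After load_env(".env"), the proxy process can read
# os.environ["OPENAI_API_KEY"] at startup.
```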

Now you should be able to run the LLMstudio Proxy with the following command:

llmstudio server --proxy

When the --proxy flag is set, you'll be able to access the Swagger UI at http://0.0.0.0:50001/docs (default port).
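Once the proxy is up, you can call it over HTTP like any REST service. The endpoint path and payload shape below are assumptions for illustration only (consult the Swagger page for the actual schema); the sketch builds the request without sending it, so it works even while the proxy is offline.

```python
import json
import urllib.request

PROXY_URL = "http://0.0.0.0:50001"  # default port from the command above

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat request to the proxy.

    The /api/engine/chat path and payload fields are hypothetical --
    check http://0.0.0.0:50001/docs for the real schema.
    """
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    return urllib.request.Request(
        f"{PROXY_URL}/api/engine/chat",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello!")
# Send with urllib.request.urlopen(req) once the proxy is running.
```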

📖 Documentation

👨‍💻 Contributing

  • Head on to our Contribution Guide to see how you can help LLMstudio.
  • Join our Discord to talk with other LLMstudio enthusiasts.

Training



Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llmstudio_proxy-1.0.1.tar.gz (5.8 kB view details)

Uploaded Source

Built Distribution

llmstudio_proxy-1.0.1-py3-none-any.whl (7.0 kB view details)

Uploaded Python 3

File details

Details for the file llmstudio_proxy-1.0.1.tar.gz.

File metadata

  • Download URL: llmstudio_proxy-1.0.1.tar.gz
  • Upload date:
  • Size: 5.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for llmstudio_proxy-1.0.1.tar.gz
  • SHA256: ef52abb2f08ecd82e4c5a7221ba396dd08a3a72251d8529a4ef529df3a77a662
  • MD5: 253651f4ee5a49abe6e45007bdda2164
  • BLAKE2b-256: 493f89a71bb59362575d5d1b7273046078add254ce99dac7c3568a2285590b5c

See more details on using hashes here.

File details

Details for the file llmstudio_proxy-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: llmstudio_proxy-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for llmstudio_proxy-1.0.1-py3-none-any.whl
  • SHA256: 8fb51105c9debb810d2f211636e91d0bd7904d1f9c771229143c4ec56ea8e327
  • MD5: 168e5e9ae0e140be76e8af1d9739cf0e
  • BLAKE2b-256: a0692a68c32cba73bdabb40b66a2004a601b75ac6392dc338e2c3b92a2e38ea4

See more details on using hashes here.
