

LLMstudio by TensorOps

Prompt Engineering at your fingertips

LLMstudio logo

[!IMPORTANT] LLMstudio now supports OpenAI v1.0+ and has just added support for Anthropic.

🌟 Features

LLMstudio UI

  1. Python Client Gateway:
    • Access models from known providers such as OpenAI, VertexAI and Bedrock. All in one platform.
    • Speed up development with tracking and robustness features from LLMstudio.
    • Continue using popular libraries like LangChain through their LLMstudio-wrapped versions.
  2. Prompt Editing UI:
    • An intuitive interface designed for prompt engineering.
    • Quickly iterate between prompts until you reach your desired results.
    • Access the history of your previous prompts and their results.
  3. History Management:
    • Track past runs, available both on the UI and on the Client.
    • Log the cost, latency and output of each prompt.
    • Export the history to a CSV.
  4. Context Limit Adaptability:
    • Automatic switch to a larger-context fallback model if the current model's context limit is exceeded.
    • Always uses the lowest-context model, switching to a higher-context one only when necessary, to save costs.
    • For instance, exceeding 4k tokens in gpt-3.5-turbo triggers a switch to gpt-3.5-turbo-16k, as illustrated in the sketch below.
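
To make the fallback idea concrete, here is a small illustrative sketch. It is not LLMstudio's internal implementation; the context limits, fallback mapping, and tiktoken-based token count below are assumptions chosen only to mirror the gpt-3.5-turbo example above.

# Illustrative sketch of context-limit fallback (not LLMstudio's internals).
import tiktoken  # assumed available for token counting

CONTEXT_LIMITS = {"gpt-3.5-turbo": 4096, "gpt-3.5-turbo-16k": 16384}
FALLBACKS = {"gpt-3.5-turbo": "gpt-3.5-turbo-16k"}

def pick_model(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    # Count tokens with the model's encoding, then escalate to the
    # larger-context variant only if the prompt does not fit.
    n_tokens = len(tiktoken.encoding_for_model(model).encode(prompt))
    while n_tokens > CONTEXT_LIMITS[model] and model in FALLBACKS:
        model = FALLBACKS[model]
    return model

print(pick_model("hello"))  # short prompt -> "gpt-3.5-turbo"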

👀 Coming soon:

  • Side-by-side comparison of multiple LLMs using the same prompt.
  • Automated testing and validation for your LLMs (create unit tests for your LLMs that are evaluated automatically).
  • API key administration. (Define quota limits)
  • Projects and sessions. (Organize your History and API keys by project)
  • Resilience against service provider rate limits.
  • Organized tracking of groups of related prompts (Chains, Agents)

🚀 Quickstart

Don't forget to check out the docs at https://docs.llmstudio.ai.

Installation

Install the latest version of LLMstudio using pip. We suggest that you first create and activate a new environment, for example with conda.
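
Assuming conda is installed, a possible setup looks like this (the environment name and Python version are arbitrary choices):

conda create -n llmstudio python=3.11
conda activate llmstudio

Then install LLMstudio: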

pip install llmstudio

Install bun if you want to use the UI

curl -fsSL https://bun.sh/install | bash

Create a .env file in the directory where you'll run LLMstudio

OPENAI_API_KEY="sk-api_key"
ANTHROPIC_API_KEY="sk-api_key"

Now you should be able to run LLMstudio using the following command.

llmstudio server --ui

When the --ui flag is set, you'll be able to access the UI at http://localhost:3000
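
With the server running, you can also call models from Python through the client gateway. The snippet below is a minimal sketch that assumes the LLM client class described at https://docs.llmstudio.ai; check the documentation for the exact interface of your installed version.

# Minimal sketch of the Python client gateway (interface assumed from the docs;
# verify against https://docs.llmstudio.ai before relying on it).
from llmstudio import LLM

llm = LLM("openai/gpt-3.5-turbo")   # "provider/model" identifier
response = llm.chat("Say hello in one sentence.")
print(response)                     # the call is also logged to History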

🤔 About LLMstudio

Powered by TensorOps, LLMstudio redefines your experience with OpenAI, Vertex AI, and other language model providers. More than just a tool, it's an evolving environment where teams can experiment, modify, and optimize their interactions with advanced language models.

Benefits include:

  • Streamlined Prompt Engineering: Simplify and enhance your prompt design process.
  • Execution History: Keep a detailed log of past executions, track progress, and make iterative improvements effortlessly.
  • Effortless Data Export: Share your team's work by exporting data to shareable CSV files.

Step into the future of AI with LLMstudio by watching our introduction video.

📖 Documentation

👨‍💻 Contributing

  • Head over to our Contribution Guide to see how you can help LLMstudio.
  • Join our Discord to talk with other LLMstudio enthusiasts.

Training

Banner


Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.

Project details


Release history

This version

0.3.6

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llmstudio-0.3.6.tar.gz (111.3 kB)

Uploaded Source

Built Distribution

llmstudio-0.3.6-py3-none-any.whl (160.7 kB)

Uploaded Python 3

File details

Details for the file llmstudio-0.3.6.tar.gz.

File metadata

  • Download URL: llmstudio-0.3.6.tar.gz
  • Upload date:
  • Size: 111.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Linux/6.5.0-1021-azure

File hashes

Hashes for llmstudio-0.3.6.tar.gz

  • SHA256: 0e02561887ced76bef7deb7b4740ac7d4cfe01a7f2d81fa6c5aeada2dc6e4746
  • MD5: f683e9ca6ce4e1f43b44f7f9e6b24f94
  • BLAKE2b-256: 613168235577e0cf9ca0f391dc534d30e77177ecfcabd9246c3444623305eb91

See more details on using hashes here.

File details

Details for the file llmstudio-0.3.6-py3-none-any.whl.

File metadata

  • Download URL: llmstudio-0.3.6-py3-none-any.whl
  • Upload date:
  • Size: 160.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Linux/6.5.0-1021-azure

File hashes

Hashes for llmstudio-0.3.6-py3-none-any.whl

  • SHA256: 9a6d5893a927b22d14c99459d83a1f4eacc2546564164e86763e3e6f3fd675c1
  • MD5: 6c7def3eaa17f2d0f7f07075ac8e97b4
  • BLAKE2b-256: 531e03fea32636f1b9fea950bd66d4fec22fe94269e31d8e735999f2e5e96afd

See more details on using hashes here.
