A lightweight Python client for interacting with LM Studio API.

Project description

LMStudio_Client

LMStudio_Client is a simple Python wrapper for the LM Studio API. Currently it provides access to the chat/completions endpoint, supporting both blocking (synchronous) and streaming interactions with an LLM running in LM Studio.

Features

  • Supports synchronous (blocking) chat responses.
  • Supports streamed responses for real-time interaction.
  • Easy-to-use interface for interacting with LM Studio.

Installation

pip install lmstudio_client  # (if available as a package)

Or clone this repository and install dependencies manually:

git clone https://github.com/eiredynamic/lmstudio-client-python
cd lmstudio-client-python

Ideally, do this inside a virtual environment:

python -m venv ./venv
source ./venv/bin/activate  # On Windows: .\venv\Scripts\activate

Install the dependencies

pip install -r requirements.txt

...and install the module locally

pip install -e .

Usage

A demonstration of how to use the client is available in example.py, a simple console-based application.

Main Functions

def chat(self, usr_prompt, sys_prompt=_sys_prompt, endpoint_root=_endpoint_root, include_reasoning=False):
    """
    Sends a user prompt to the LMStudio API and returns the complete response.
    
    :param usr_prompt: The user input text.
    :param sys_prompt: (Optional) System prompt to guide the model.
    :param endpoint_root: (Optional) Root endpoint of the API.
    :param include_reasoning: (Optional) Whether to include model reasoning.
    :return: The full response from the LLM.
    """
def stream_chat(self, usr_prompt, sys_prompt=_sys_prompt, endpoint_root=_endpoint_root, include_reasoning=False):
    """
    Sends a user prompt to the LMStudio API and returns a streamed response.
    
    :param usr_prompt: The user input text.
    :param sys_prompt: (Optional) System prompt to guide the model.
    :param endpoint_root: (Optional) Root endpoint of the API.
    :param include_reasoning: (Optional) Whether to include model reasoning.
    :yield: Streamed response chunks from the LLM.
    """

Example Usage

Blocking Call

from lmstudio_client.client import Client

client = Client()
response = client.chat("Tell me a joke.")
print(response)

Streaming Call

response_stream = client.stream_chat("Tell me a joke.")
for chunk in response_stream:
    print(chunk, end="", flush=True)
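Under the hood, a streamed response from an OpenAI-compatible server arrives as server-sent events: each line is prefixed with "data: " and carries a JSON chunk whose choices[0].delta.content holds the next text fragment, with a final "data: [DONE]" sentinel. A minimal parsing sketch (the extract_content helper is hypothetical, shown only to illustrate the chunk format):

```python
import json

# Hypothetical helper: pull the text fragment out of one SSE line
# from an OpenAI-compatible streaming response.
def extract_content(line):
    if not line.startswith("data: "):
        return None  # ignore blank/keep-alive lines
    data = line[len("data: "):]
    if data.strip() == "[DONE]":
        return None  # end-of-stream sentinel
    chunk = json.loads(data)
    return chunk["choices"][0]["delta"].get("content")
```

The client's stream_chat yields these fragments for you, so the loop above only has to print them.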

Project Repo

https://github.com/eiredynamic/lmstudio-client-python

License

This project is licensed under the MIT License. See LICENSE for details.

Contributions

Contributions are welcome! Feel free to submit a pull request or report issues.

Acknowledgments

Thanks to LM Studio for providing the API interface.

Project details


Download files

Download the file for your platform.

Source Distribution

lmstudio_client-0.1.0.tar.gz (4.8 kB)


Built Distribution


lmstudio_client-0.1.0-py3-none-any.whl (5.1 kB)


File details

Details for the file lmstudio_client-0.1.0.tar.gz.

File metadata

  • Download URL: lmstudio_client-0.1.0.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for lmstudio_client-0.1.0.tar.gz

  • SHA256: 07fbdd15e65af1594e30538ae8047378ddd69873ee66bf217be87494c1a8fa40
  • MD5: 0176a291fe2205320f1bd260375f28f7
  • BLAKE2b-256: b4000792a5e40a4ecfdeff25d6bf379a0d68dc259c8fc7cf272b3f741b668437


File details

Details for the file lmstudio_client-0.1.0-py3-none-any.whl.

File hashes

Hashes for lmstudio_client-0.1.0-py3-none-any.whl

  • SHA256: 183f17fd1b6d2ecb68fa940696479b1f1c3279419d5a2c2eb7b0162c59710cf3
  • MD5: c3f7bea3d00f5cb72c41eeed1290a0c0
  • BLAKE2b-256: d640a4b9328c08a2044b5c069e55b4be5460668bf6548e9965d22bf0318291f7

