
Run a socket server for AI models.

Project description

RunAI

RunAI allows you to run LLMs using a socket server.


Features

  • Offline friendly: works completely locally with no internet connection (models must be downloaded first)
  • Sockets: handles byte packets of arbitrary size
  • Threaded: handles requests and responses asynchronously
  • Queue: requests and responses are handed off to a queue (see the sketch after this list)
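
The queue hand-off can be pictured with the sketch below. It is an illustration of the pattern only, not the package's actual implementation; the host, port, and packet size are placeholder values.

import queue
import socket
import threading

request_queue: queue.Queue = queue.Queue()  # hand-off point between client threads and the worker

def worker() -> None:
    """Drain the queue and reply to each request (a real server would call the model here)."""
    while True:
        conn, payload = request_queue.get()
        try:
            conn.sendall(b"echo: " + payload)  # placeholder for a model response
        except OSError:
            pass  # client disconnected before the reply was ready
        request_queue.task_done()

def handle_client(conn: socket.socket) -> None:
    """Read byte packets from one client and push them onto the queue."""
    with conn:
        while data := conn.recv(1024):  # placeholder packet size
            request_queue.put((conn, data))

def serve(host: str = "127.0.0.1", port: int = 5000) -> None:  # placeholder endpoint
    threading.Thread(target=worker, daemon=True).start()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()  # accept connections from any client on the port
            threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()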

Limitations

Data between server and client is not encrypted

This only matters if someone wants to create a production-ready version of this server and host it on the internet. This server is not designed for that purpose. It was designed with a single use case in mind: running Stable Diffusion (and other AI models) locally. It was built for use with the Krita Stable Diffusion plugin, but it can work with any interface, provided someone writes a client for it.

Only works with Mistral

This library was designed to work with the Mistral model, but it can be expanded to work with any LLM.


Installation

pip install runai
cp src/runai/default.settings.py src/runai/settings.py

Modify settings.py as you see fit.
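
The real default.settings.py ships with the package and defines its own keys; the snippet below is only a hypothetical illustration of the kind of values you would adjust. The names and values (host, port, packet size, model id) are made-up placeholders, not the library's actual settings.

# settings.py - hypothetical example; match the keys that actually appear in default.settings.py
SERVER_HOST = "0.0.0.0"  # interface the socket server binds to (placeholder)
SERVER_PORT = 5000       # port clients connect to (placeholder)
PACKET_SIZE = 1024       # size of each byte packet read from the socket (placeholder)
MODEL_PATH = "mistralai/Mistral-7B-Instruct-v0.1"  # example model id (placeholder)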


Run server and client

See src/runai/server.py for an example of how to run the server and src/runai/client.py for an example of how to run the client. Both of these files can be run directly from the command line.

The socket client will continuously attempt to connect to the server until it is successful. The server will accept connections from any client on the given port.
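
The retry behaviour can be reproduced with nothing but the standard library. The sketch below is a generic stand-in, not the client shipped in src/runai/client.py; the host, port, and message are placeholders, and the real client layers packet framing and the response queue on top of a loop like this.

import socket
import time

def connect_with_retry(host: str, port: int, delay: float = 1.0) -> socket.socket:
    """Keep trying to reach the server until a connection succeeds."""
    while True:
        try:
            sock = socket.create_connection((host, port))
            print(f"connected to {host}:{port}")
            return sock
        except OSError:
            print("server not available yet, retrying...")
            time.sleep(delay)

if __name__ == "__main__":
    sock = connect_with_retry("127.0.0.1", 5000)  # placeholder endpoint
    sock.sendall(b"hello from the client")  # the real protocol sends framed byte packets
    print(sock.recv(1024))                  # read a single response packet
    sock.close()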



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airunner_nexus-1.4.4.tar.gz (14.9 kB, Source)

Built Distribution

airunner_nexus-1.4.4-py3-none-any.whl (14.6 kB, Python 3)

File details

Details for the file airunner_nexus-1.4.4.tar.gz.

File metadata

  • Download URL: airunner_nexus-1.4.4.tar.gz
  • Upload date:
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for airunner_nexus-1.4.4.tar.gz

  • SHA256: aa6d49b863fd5e7211e9dbc04465c545e45fd0d93333f80eb5caf3259110bd1a
  • MD5: bef5560807eba112f635df3d0c26706b
  • BLAKE2b-256: 180d024987fc09e382158e79f976b38e8545e6a4d6c7f26a5f950113553367d2


File details

Details for the file airunner_nexus-1.4.4-py3-none-any.whl.

File metadata

File hashes

Hashes for airunner_nexus-1.4.4-py3-none-any.whl

  • SHA256: ff75217da4dfa794cad006f0022ed6c384045cf32963bfed459d88006558d5a1
  • MD5: 0dbecedee80489cf6cb6a81498b5b672
  • BLAKE2b-256: ace46422e6b92c92a987ce0681d02d53d80c6060eb3497eebd9cfa398241f5e3

