
A local implementation of the OpenAI Assistants API: Myla stands for MY Local Assistants

Project description


Myla: MY Local Assistants

Self-hosting AI Assistants compatible with OpenAI

Myla stands for MY Local Assistants and is designed and optimized for deploying AI assistants based on large language models (LLMs) in a private environment. Myla provides an API compatible with the OpenAI Assistants API and supports multiple LLM backends. Whether on a laptop or a production server, you can quickly develop and run AI assistants.

Key Features:

  • Support for the OpenAI API and compatible LLM services as backends
  • Assistants API compatible with OpenAI
  • Vector retrieval (FAISS/LanceDB)
  • Local embeddings via sentence_transformers
  • WebUI
  • Tool extensions
  • Document Q&A (in progress)

Quick Start

Installation

Python version requirement: >= 3.9

Myla can be installed from PyPI using pip. It is recommended to create a new virtual environment before installation to avoid conflicts.
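
For example, using Python's built-in venv module (standard Python tooling, not specific to Myla; the environment name .venv is arbitrary):

python -m venv .venv
source .venv/bin/activate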

pip install myla

If you need retrieval, install all optional dependencies:

pip install "myla[all]"

Configuration

Myla supports using an OpenAI API-compatible LLM service as the backend. You can use the OpenAI API directly or deploy your own local LLM. If you want to deploy a local LLM, it is recommended to use Xorbits Inference.

Create a .env file in the current directory with the following content:

# LLM configuration
LLM_ENDPOINT=https://api.openai.com/v1/
LLM_API_KEY=sk-xx
DEFAULT_LLM_MODEL_NAME=gpt-3.5-turbo

More configuration options can be found in env-example.txt

Using ChatGLM as the backend on your MacBook

Myla supports running ChatGLM locally with chatglm.cpp as the backend. To install the Python binding, refer to: https://github.com/li-plus/chatglm.cpp#python-binding

.env configuration example:

DEFAULT_LLM_MODEL_NAME=chatglm@/Users/shellc/Workspaces/chatglm.cpp/chatglm-ggml.bin

Start

myla

or

python -m myla

For more startup options:

myla --help

WebUI

Myla provides a simple web interface that makes it easy to develop and debug assistants.

Access from your browser: http://localhost:2000/

Screenshot

API

You can use the OpenAI Python SDK directly to access the Assistants API.
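
A minimal sketch using the openai Python SDK (v1.x), assuming Myla is running locally on the same host and port as the WebUI; the base_url path, the placeholder API key, and the model name are assumptions taken from the example configuration above and may need to be adjusted for your deployment:

from openai import OpenAI

# Point the SDK at the local Myla server instead of api.openai.com.
# The exact base_url path is an assumption; adjust it to your deployment.
client = OpenAI(base_url="http://localhost:2000/api/v1", api_key="not-needed-locally")

# Create an assistant backed by the model configured in .env
assistant = client.beta.assistants.create(
    name="my-assistant",
    instructions="You are a helpful assistant.",
    model="gpt-3.5-turbo",
)

# Start a thread, add a user message, and run the assistant on it
thread = client.beta.threads.create()
client.beta.threads.messages.create(thread_id=thread.id, role="user", content="Hello!")
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

Runs are asynchronous in the Assistants API, so in practice you would poll client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id) until it completes, then read the reply with client.beta.threads.messages.list(thread_id=thread.id).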

Community

Myla is still under rapid development, and community contributions are welcome.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution


myla-0.2.40-py3-none-any.whl (588.5 kB, Python 3)

File details

Details for the file myla-0.2.40-py3-none-any.whl.

File metadata

  • Download URL: myla-0.2.40-py3-none-any.whl
  • Upload date:
  • Size: 588.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.7

File hashes

Hashes for myla-0.2.40-py3-none-any.whl

  • SHA256: daf541cce63fbe303de8cee96d1c1994ecd7dcc1fd9ba883f191b95835ba599b
  • MD5: 9ce735d91939759295d5fffc00799b0e
  • BLAKE2b-256: 363e2ba7b674cab48b73023619fd43be82376615d55a913bc11a00cec03b4840

