
The open-source yshs library.

Project description

This is the open-source Python library of Yuan Shang Han Shan (YSHS) Co., Ltd.

Install

pip install yshs

Update logs

  • 20240617, v1.1.0: added support for the OpenAI-compatible format

Usage

List Models

List all available models.

import os
import yshs

yshs.api_key = os.getenv('YSHS_API_KEY')  # read the API key from the environment

# List all available models with full details
response = yshs.Models.list(refresh=True, return_all_info=True)
print(response)
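If you only need the model names, you can iterate over the returned list. This is a minimal sketch: the exact shape of the response object is an assumption here, so adapt the field access to whatever print(response) shows.

# Sketch: print one name per model (assumes the response is iterable and
# each entry is either a plain string or a dict with an 'id' field)
models = yshs.Models.list(refresh=True, return_all_info=True)
for m in models:
    name = m['id'] if isinstance(m, dict) else m
    print(name)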

Request a Model via the OpenAI Client

import os

from openai import OpenAI

base_url = 'https://www.yshs.vip/v1'
api_key = os.getenv("YSHS_API_KEY")
model = "openai/gpt-3.5-turbo"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "hello"},
]

client = OpenAI(
    base_url=base_url,
    api_key=api_key,
)

# Stream the reply and print it token by token
response = client.chat.completions.create(
    model=model,
    messages=messages,
    stream=True,
)

full_response = ''
for chunk in response:
    x = chunk.choices[0].delta.content
    if x:
        full_response += x
        print(x, end='', flush=True)
print()
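If you do not need streaming, the standard blocking call of the OpenAI client works the same way against this endpoint. A minimal sketch, assuming the endpoint behaves like the regular OpenAI Chat Completions API:

# Non-streaming variant: get the whole reply at once
completion = client.chat.completions.create(
    model=model,
    messages=messages,
)
print(completion.choices[0].message.content)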

Request AI Model

import os, sys
import yshs

yshs.api_key = os.getenv('YSHS_API_KEY')

def request_model():
    response = yshs.LLM.chat(
        model="openai/gpt-3.5-turbo",  # choose a model
        messages=[
          {"role": "system", "content": "You are a helpful assistant."},  # system prompt
          {"role": "user", "content": "Who won the world series in 2020?"},  # first question
          # {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},  # answer to the first question
          # {"role": "user", "content": "Where was it played?"}  # second question
      ]
    )

    full_response = ""
    for x in response:
        sys.stdout.write(x)  # print the output token by token
        sys.stdout.flush()
        full_response += x
    print()
    return full_response

answer = request_model()
# print(answer)
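To ask a follow-up question, append the first answer and the new question to the message list and call yshs.LLM.chat again, mirroring the commented-out lines above. A minimal sketch, assuming the same streaming behaviour as in request_model():

# Follow-up turn: reuse the earlier answer as assistant context
follow_up = yshs.LLM.chat(
    model="openai/gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": answer},              # answer from the first turn
        {"role": "user", "content": "Where was it played?"},   # second question
    ],
)
print(''.join(chunk for chunk in follow_up))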

Continuous Conversation

Create a chat thread to continue the conversation automatically.

from yshs import Client

client = Client()  # Note: one Client can hold multiple ChatTheads; each ChatThead corresponds to one Conversation (identified by its chat_id), and a Conversation consists of multiple turns

prompt = "hello"  # user prompt
for chunk in client.send_prompt(prompt, chat_id=None):  # send_prompt() automatically creates a ChatThead
    print(chunk['response'], end='', flush=True)
print()
chat_id = chunk['chat_id']
print(f'chat_id: {chat_id}')

prompt = 'who are you?'
for chunk in client.send_prompt(prompt, chat_id=chat_id):  # send_prompt() matches the ChatThead by chat_id, creating one if it does not exist
    print(chunk['response'], end='', flush=True)
print()
print(f'chat_id: {chat_id}')
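Putting this together, a small interactive loop can keep reusing the same chat_id so every prompt lands in the same conversation. A minimal sketch built only from the calls shown above:

from yshs import Client

client = Client()
chat_id = None  # the first call creates a new conversation

while True:
    prompt = input('you> ')
    if not prompt:
        break
    for chunk in client.send_prompt(prompt, chat_id=chat_id):
        print(chunk['response'], end='', flush=True)
    print()
    chat_id = chunk['chat_id']  # keep later turns in the same conversation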
