
A Python library for interacting with various language model APIs

Project description

This is a personal library developed by Qichang Zheng.

Installation

pip install qichang

LLM Conversation

For a single-turn conversation:

import qichang
llm = qichang.LLM_API()
# The first argument selects the model; the second is the prompt
llm.chat('GPT3.5', 'Hello, how are you?')
llm.chat('GPT4', 'Hello, how are you?')
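
Each call above is independent (no conversation history is kept). A minimal sketch of using the return value, assuming chat() returns the model's reply as a string:

import qichang

llm = qichang.LLM_API()
# Assumption: chat() returns the reply text directly
reply = llm.chat('GPT3.5', 'Summarize the plot of Hamlet in one sentence.')
print(reply)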

For multi-turn conversations:

We currently run two servers, one in Virginia and one in Singapore. By default, the library automatically picks the server with the lowest latency, but you can also set the server manually.

import qichang
llm = qichang.LLM_API()
# Manually set the server
# llm.server = 'Virginia'
# llm.server = 'Singapore'
# Specify a chatID (string) to distinguish different conversations
llm.chat('GPT3.5', 'Hello, how are you?', 'ChatID')
llm.chat('GPT4', 'What did I just ask?', 'ChatID')
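
Because conversation history is keyed by the chatID, different IDs keep independent conversations. A minimal sketch (the IDs and prompts here are illustrative, not part of the library):

import qichang

llm = qichang.LLM_API()
# Two independent conversations, distinguished by their chat IDs
llm.chat('GPT3.5', 'My name is Alice.', 'chat-alice')
llm.chat('GPT3.5', 'My name is Bob.', 'chat-bob')
# Each follow-up only sees the history of its own chat ID
llm.chat('GPT3.5', 'What is my name?', 'chat-alice')  # should recall Alice
llm.chat('GPT3.5', 'What is my name?', 'chat-bob')    # should recall Bob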

Model Downloader

This is a tool for downloading Hugging Face models from within China. Note that it currently works only for some models; the author is working on further improvements.

import qichang
downloader = qichang.Model_Downloader()
downloader.download('Qwen/Qwen-7B-Chat', 'test') # Download the model to the folder 'test'
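
After the download completes, the local folder can be loaded with the standard transformers API. A minimal sketch, assuming the files were saved to 'test' as above (Qwen models require trust_remote_code=True):

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from the local folder created by the downloader
tokenizer = AutoTokenizer.from_pretrained('test', trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained('test', trust_remote_code=True)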

Davinci Embedding

This is a tool for retrieving text embeddings from the Davinci model.

import os
import qichang

os.environ["OPENAI_API_KEY"] = "your_api_key"

Embedder = qichang.Embedder()
Embedder.embedding('Hello, how are you?')
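
The returned vector can be used for tasks such as semantic similarity. A minimal sketch, assuming embedding() returns the embedding as a list of floats:

import numpy as np

a = np.array(Embedder.embedding('Hello, how are you?'))
b = np.array(Embedder.embedding('Hi, how is it going?'))
# Cosine similarity between the two embeddings
similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(similarity)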

