Create a Python package.

Project description

🔥 ChatLLM: chat grounded in a knowledge base 🔥

Install

pip install -U chatllm

Docs

Usage

from chatllm.applications import ChatBase

qa = ChatBase()
qa.load_llm4chat(model_name_or_path="THUDM/chatglm-6b")

for i, _ in qa(query='周杰伦是谁', knowledge_base='周杰伦是傻子'):
    pass
# query: "Who is Jay Chou?"; knowledge_base: "Jay Chou is a fool"
# Output (translated): Based on the known information, this question cannot be answered. Jay Chou is a
# mainland-Chinese pop singer, actor, music producer, and director, a figure with real fame and
# professional standing; the information provided is not sufficient to conclude that he is a fool.
ChatPDF
from chatllm.applications.chatpdf import ChatPDF

qa = ChatPDF(encode_model='nghuyong/ernie-3.0-nano-zh')
qa.load_llm4chat(model_name_or_path="THUDM/chatglm-6b")

for i, _ in qa(query='东北证券主营业务'):
    pass
# query: "Northeast Securities' main business"
# Output (translated): Based on the known information, Northeast Securities' main business is securities.
# As a securities firm, it mainly engages in securities brokerage, investment consulting, financial
# advisory related to securities trading and investment, underwriting and sponsorship, proprietary
# trading, margin trading and securities lending, fund distribution, and distribution of financial products.

One-click webui launch: chatllm-run webui --name chatpdf

[Screenshot: vector retrieval results]
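The vector-recall step behind a knowledge-base chat can be sketched generically: embed the query and the document chunks, then take the top-k chunks by cosine similarity. This is a toy NumPy illustration of the idea, not the project's actual retrieval code:

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=3):
    """Return indices of the k document vectors most similar to the
    query by cosine similarity (the usual recall step before the LLM
    answers over the retrieved chunks)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # highest-scoring indices first

# Toy 2-d vectors standing in for sentence embeddings.
docs = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
print(top_k(np.array([1.0, 0.1]), docs, k=2))
```

In a real pipeline the embeddings would come from the configured encode_model, and the retrieved chunks would be concatenated into the prompt as the knowledge base.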

Development and deployment
  • ChatGLM-6B model hardware requirements

    Quantization level       Min GPU memory (inference)   Min GPU memory (efficient fine-tuning)
    FP16 (no quantization)   13 GB                        14 GB
    INT8                     8 GB                         9 GB
    INT4                     6 GB                         7 GB
  • Embedding model hardware requirements

    The default Embedding model in this project, GanymedeNil/text2vec-large-chinese, occupies about 3 GB of GPU memory; it can also be configured to run on the CPU.
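The inference figures in the table above roughly track parameter count times bytes per weight. A back-of-the-envelope check, assuming ChatGLM-6B's roughly 6.2 billion parameters and counting weights only (activations, KV cache, and runtime overhead excluded, which is why the real requirements run a bit higher):

```python
PARAMS = 6.2e9  # approximate parameter count of ChatGLM-6B

def weight_gib(bits_per_param, params=PARAMS):
    """Approximate memory for the model weights alone, in GiB."""
    return params * bits_per_param / 8 / 1024**3

for name, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    print(f"{name}: ~{weight_gib(bits):.1f} GiB for weights")
```

FP16 comes out near 11.5 GiB for weights alone, consistent with the 13 GB inference figure once runtime overhead is added.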

Software requirements

This project has been tested with Python 3.8 - 3.10 and CUDA 11.7, on Windows, ARM-based macOS, and Linux.
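A quick guard for the tested interpreter range can save debugging time; this helper is purely illustrative and not part of the package:

```python
import sys

def in_tested_range(version=sys.version_info):
    """True when the interpreter falls in the tested Python 3.8 - 3.10 range."""
    return (3, 8) <= (version[0], version[1]) <= (3, 10)

print(in_tested_range())
```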

Loading the model from a local path

See THUDM/ChatGLM-6B#从本地加载模型

Notes on deploying ChatGLM-6B locally on a Mac

1. Set up the environment

See the installation guide

TODO
  • Add one-click webui launch

    • chatllm-run webui --name chatpdf
  • Add ChatPDF

  • Add a local knowledge-base component

  • Add an internet-search component

  • Add a knowledge-graph component

  • Add a fine-tuning module

  • Add streaming output

  • Add an HTTP API

  • Add a gRPC API

History

0.0.0 (2023-04-11)

  • First release on PyPI.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ChatLLM-2023.4.26.20.32.6.tar.gz (7.7 MB)

Uploaded Source

Built Distribution

ChatLLM-2023.4.26.20.32.6-py3-none-any.whl (27.9 kB)

Uploaded Python 3

File details

Details for the file ChatLLM-2023.4.26.20.32.6.tar.gz.

File metadata

  • Download URL: ChatLLM-2023.4.26.20.32.6.tar.gz
  • Upload date:
  • Size: 7.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.13

File hashes

Hashes for ChatLLM-2023.4.26.20.32.6.tar.gz
Algorithm Hash digest
SHA256 8158a1acbaf7030210b3a77e083436bdebff049f7d7d1ab0859c026a20f6a15a
MD5 80391aefcc98a66e5eabcfa85e9fd86a
BLAKE2b-256 b12bfa1f26cfe81ef144c93b979bc2eb38091946a3db434ac0543c7e50a88b6c

See more details on using hashes here.
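The published hashes can be checked against a downloaded file with the standard library; a minimal sketch:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 value published above for the
# archive you downloaded; a mismatch means a corrupted or tampered file.
```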

File details

Details for the file ChatLLM-2023.4.26.20.32.6-py3-none-any.whl.

File metadata

File hashes

Hashes for ChatLLM-2023.4.26.20.32.6-py3-none-any.whl
Algorithm Hash digest
SHA256 c0bc41ee8aeb488b80d3e71b589f6aa3039ff50389d119c76c79e4ffbe330a84
MD5 042062c7f4ce684314b0e174e28f89f4
BLAKE2b-256 606d66268c839218dbe04c53ce77065ece33d1089761748ed8d7ab62cd01f8e1

See more details on using hashes here.
