Humanable ChatGPT/GLM Fine-tuning.
Usage
Prepare the data
One JSON object per line; each line must contain the two fields prompt and completion. Example:
{"prompt": "问题:你是谁?\n", "completion": "不告诉你。"}
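The data-preparation step above can be sketched in Python. The file name train.jsonl, the records list, and the write_jsonl helper are illustrative assumptions, not part of the library:

```python
import json

# Hypothetical example records; each must carry a "prompt" and a "completion" field.
records = [
    {"prompt": "问题:你是谁?\n", "completion": "不告诉你。"},
    {"prompt": "问题:你会什么?\n", "completion": "聊天。"},
]

def write_jsonl(records, path):
    """Write one JSON object per line (valid JSON, no trailing commas)."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            # Fail early if a record is missing a required field.
            assert "prompt" in rec and "completion" in rec, "both fields are required"
            f.write(json.dumps(rec, ensure_ascii=False) + "\n")

write_jsonl(records, "train.jsonl")
```

ensure_ascii=False keeps the Chinese text readable in the file instead of escaping it to \uXXXX sequences.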
Standard fine-tuning
# Fine-tuning
from hcgf.sft import GlmLora
data_path = "/path/to/json_data/"
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_data(data_path).tune(device)
# Inference
from hcgf.sft import GlmLora
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_pretrained("/path/to/lora_pt").eval()
inp = "你是谁?"
model.chat(inp)