Humanable ChatGPT/GLM Fine-tuning.
Project description
Usage
Prepare the data
One JSON object per line; each line must contain the two fields prompt and completion. For example:
{"prompt": "问题:你是谁?\n", "completion": "不告诉你。"}
Standard fine-tuning
# Fine-tune
from hcgf.sft import GlmLora
data_path = "/path/to/json_data/"
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_data(data_path).tune(device)
# Inference
from hcgf.sft import GlmLora
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_pretrained("/path/to/lora_pt").eval()
inp = "你是谁?"
model.chat(inp)
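As a small usage sketch built on the inference example above (assuming chat() returns the generated reply; if it instead returns something like a (reply, history) tuple, adjust the print accordingly):

# Query the fine-tuned model in a simple loop; an empty input exits.
while True:
    inp = input("You: ")
    if not inp.strip():
        break
    out = model.chat(inp)  # return format is an assumption, see note above
    print("Model:", out)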
Download files
Download the file for your platform.
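Most users will not need the files below; installing this release from PyPI with pip should be enough:

pip install hcgf==0.0.1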
Source Distribution
hcgf-0.0.1.tar.gz (5.5 kB)
Built Distribution
hcgf-0.0.1-py3-none-any.whl (5.7 kB)
File details
Details for the file hcgf-0.0.1.tar.gz.
File metadata
- Download URL: hcgf-0.0.1.tar.gz
- Upload date:
- Size: 5.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 35ac52804870b73eb0a8be97e9479977382714b3442172e8029c283793048b1c
MD5 | 5bfe9bf715edab4b792474645033b5ee
BLAKE2b-256 | bb7573bd939b08c9d17ffce73c18ba3086ee54a63876b9c567ce435f53bc90c8
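If you download the sdist by hand, the SHA256 digest above can be checked locally; a short sketch (the local filename is assumed to match the listing):

import hashlib

# Compare the downloaded file against the SHA256 digest listed above.
expected = "35ac52804870b73eb0a8be97e9479977382714b3442172e8029c283793048b1c"
with open("hcgf-0.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "Hash mismatch!")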
File details
Details for the file hcgf-0.0.1-py3-none-any.whl.
File metadata
- Download URL: hcgf-0.0.1-py3-none-any.whl
- Upload date:
- Size: 5.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 62bc57ca544b641b74f81c7230877dba1eacc0ef8a1e2bbbc3a01648115b688a
MD5 | b2ca778af9776648ca841f619c74036f
BLAKE2b-256 | b31de39f40db1853c0f5d13af25529a63173f1918b765bb3b3a9a7faf61b6984