
Humanable ChatGPT/GLM Fine-tuning.

Project description

Usage

Prepare the data

Each line is a single JSON object and must contain two fields: prompt and completion. For example:

{"prompt": "问题:你是谁?\n", "completion": "不告诉你。"},

Standard fine-tuning

# Fine-tuning
from hcgf.sft import GlmLora
data_path = "/path/to/json_data/"
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_data(data_path).tune(device)

# Inference
from hcgf.sft import GlmLora
model_id = "THUDM/chatglm-6b"
device = "cuda:0"
model = GlmLora(model_id, device=device)
model.load_pretrained("/path/to/lora_pt").eval()
inp = "你是谁?"
model.chat(inp)


Download files

Download the file for your platform.

Source Distribution

hcgf-0.0.2.tar.gz (36.0 kB)

Built Distribution

hcgf-0.0.2-py3-none-any.whl (38.4 kB)
