Chinese Generative Pre-Training Transformer

Project description

Chinese-GPT: A Pre-Trained Chinese GPT Model

Chinese Generative Pre-Training (GPT) Language Model

This project is a unidirectional transformer GPT model (117M parameters) trained on a large corpus, following the approach of OpenAI GPT-2. Due to limited computational resources, we did not train the model from scratch. Instead, we took advantage of BERT and used its weights as initialization to train our Chinese GPT, which makes training feasible on 4 x 1080 Ti GPUs.
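To make the initialization idea concrete, here is a minimal sketch, assuming the pytorch-pretrained-bert package (the "Pytorch BERT" listed under References). It loads the Chinese BERT weights and picks out the tensors a GPT-style decoder can share with BERT; the key-name filter is illustrative, not the project's actual weight-mapping code.

from pytorch_pretrained_bert import BertModel

# Load the 12-layer Chinese BERT whose weights serve as the initialization.
bert = BertModel.from_pretrained('bert-base-chinese')
state = bert.state_dict()

# A unidirectional decoder can inherit the token embeddings and the
# self-attention/feed-forward blocks; pieces like the LM head still
# have to be trained from scratch.
shared = {k: v for k, v in state.items()
          if 'embeddings.word_embeddings' in k or 'encoder.layer' in k}
print(f'{len(shared)} of {len(state)} BERT tensors are reusable')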

Please note, however, that the model's performance still cannot match that of the original English GPT-2, for several possible reasons: OpenAI has likely done better text filtering and has a higher-quality dataset, and they trained their model for at least roughly 300 GPU-days. Even so, the model here can be a good starting point if you want to apply it to downstream tasks.

Installation

Before using the package, you might want to install the requirements first:

pip install -r requirements.txt

You can also install it via pip.

pip install chinese-gpt

Usage

Check the tutorials for details.

Data Preparation

Training GPT requires a dataset drawn from a wide range of sources.

We collected data from the NLP Chinese Corpus.

In detail, we used the following corpora (a sketch of reading them appears after the list):

  • Community Q&A, JSON version (webtext2019zh): a large-scale, high-quality dataset
  • Encyclopedia Q&A, JSON version (baike2018qa)
  • News corpus, JSON version (news2016zh)
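
Each corpus ships as JSON lines, one record per line. Below is a minimal sketch of flattening a file into plain training text; the field names ('title', 'desc', 'content') are assumptions based on the corpus release notes, so check the actual files before relying on them.

import json

def iter_texts(path):
    # Yield one training document per JSON line, joining whichever of the
    # assumed text fields each record actually carries.
    with open(path, encoding='utf-8') as f:
        for line in f:
            record = json.loads(line)
            parts = [record.get(key, '') for key in ('title', 'desc', 'content')]
            text = '\n'.join(part for part in parts if part)
            if text:
                yield text

# Example with an illustrative filename: count documents in the news corpus.
# n = sum(1 for _ in iter_texts('news2016zh_train.json'))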

Text Filtering

One thing to take care of is text filtering. Since the BERT Chinese tokenizer does not cover some punctuation marks, you might want to use the following code to clean your data first:

import regex as re

def filterPunctuation(x):
    # Normalize curly quotes to the ASCII equivalents that the BERT
    # Chinese vocabulary covers.
    x = re.sub(r'[‘’]', "'", x)
    x = re.sub(r'[“”]', '"', x)
    # Expand the ellipsis character and normalize the em dash.
    x = re.sub(r'…', '...', x)
    x = re.sub(r'—', '-', x)
    # Strip HTML non-breaking-space entities left over from web scraping.
    x = re.sub(r'&nbsp;?', '', x)
    return x
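
For example, applying the function above to a scraped sentence:

print(filterPunctuation('他说：“你好……”&nbsp;'))
# -> 他说："你好......"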

You may also want to convert traditional Chinese to simplified Chinese and apply other filtering techniques suited to your data.
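
For the traditional-to-simplified step, one option is the OpenCC Python bindings (pip install opencc-python-reimplemented); this is a suggestion on our part, not a dependency of this project:

from opencc import OpenCC

# 't2s' selects OpenCC's traditional-to-simplified conversion profile.
t2s = OpenCC('t2s')
print(t2s.convert('漢語模型'))  # -> '汉语模型'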

Reference

OpenAI GPT-2

BERT

AllenNLP

Pytorch BERT

NLP Chinese Corpus

