
Model hub for transformers.

Project description

Usage Sample
''''''''''''

.. code:: python

    from sklearn.model_selection import train_test_split
    import torch
    from transformers import BertTokenizer
    from nlpx.dataset import TextDataset, text_collate
    from nlpx.model.wrapper import ClassifyModelWrapper
    from transformers_model import AutoCNNTextClassifier, AutoCNNTokenClassifier, \
            BertDataset, BertCollator, BertTokenizeCollator

    texts = [[str],]                             # placeholder: the input texts
    labels = [0, 0, 1, 2, 1...]                  # placeholder: integer class labels
    pretrained_path = "clue/albert_chinese_tiny"
    classes = ['class1', 'class2', 'class3'...]  # human-readable class names
    train_texts, test_texts, y_train, y_test = train_test_split(texts, labels, test_size=0.2)
    
    train_set = TextDataset(train_texts, y_train)
    test_set = TextDataset(test_texts, y_test)

    ################################### TextClassifier ##################################
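    # TextDataset pairs the raw texts with their labels; text_collate batches them for training.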
    model = AutoCNNTextClassifier(pretrained_path, len(classes))
    wrapper = ClassifyModelWrapper(model, classes)
    _ = wrapper.train(train_set, test_set, collate_fn=text_collate)

    ################################### TokenClassifier #################################
    tokenizer = BertTokenizer.from_pretrained(pretrained_path)

    ##################### BertTokenizeCollator #########################
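    # BertTokenizeCollator tokenizes the raw texts batch-by-batch during training
    # (the 256 is presumably the max sequence length).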
    model = AutoCNNTokenClassifier(pretrained_path, len(classes))
    wrapper = ClassifyModelWrapper(model, classes)
    _ = wrapper.train(train_set, test_set, collate_fn=BertTokenizeCollator(tokenizer, 256))

    ##################### BertCollator ##################################
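    # Here the texts are tokenized up front, so BertCollator() only needs to batch
    # the pre-computed tensors stored in BertDataset.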
    train_encodings = tokenizer.batch_encode_plus(
            train_texts,
            max_length=256,
            padding="max_length",
            truncation=True,
            return_token_type_ids=True,
            return_attention_mask=True,
            return_tensors="pt",
    )

    test_encodings = tokenizer.batch_encode_plus(
            test_texts,
            max_length=256,
            padding="max_length",
            truncation=True,
            return_token_type_ids=True,
            return_attention_mask=True,
            return_tensors="pt",
    )

    train_set = BertDataset(train_encodings, y_train)
    test_set = BertDataset(test_encodings, y_test)

    model = AutoCNNTokenClassifier(pretrained_path, len(classes))
    wrapper = ClassifyModelWrapper(model, classes)
    _ = wrapper.train(train_set, test_set, collate_fn=BertCollator())



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
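
The package can also be installed from PyPI with pip; a minimal sketch, assuming the distribution name matches the source tarball listed below:

.. code:: bash

    pip install transformers-model==0.1.2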

Source Distribution

transformers-model-0.1.2.tar.gz (8.3 kB)

Uploaded Source

File details

Details for the file transformers-model-0.1.2.tar.gz.

File metadata

  • Download URL: transformers-model-0.1.2.tar.gz
  • Upload date:
  • Size: 8.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.18

File hashes

Hashes for transformers-model-0.1.2.tar.gz
  • SHA256: bbda06fb00df2e9d0d7f22227fbdd2e274a006627bc795ec03c89b836e251a23
  • MD5: 36b629481e36c8e47a5206151c06e1b0
  • BLAKE2b-256: e5f43dbd9d47aa8679031a72fe871e3c0f728b54a10e9341647ced328297926a

See more details on using hashes here.
