
Textformer: Transformer-Based Text Operators


Welcome to Textformer.

Did you ever want to transform text? Are you tired of re-implementing and defining state-of-the-art architectures? If yes, Textformer is the way to go! This package provides a straightforward implementation of sequence-to-sequence and transformer-based architectures, fostering all research related to text generation and translation.

Use Textformer if you need a library or wish to:

  • Create your network;
  • Design or use pre-loaded state-of-the-art architectures;
  • Mix and match encoders and decoders to solve your problem;
  • Because it is fun to transform text.

Read the docs at textformer.readthedocs.io.

Textformer is compatible with: Python 3.6+.


Package guidelines

  1. The very first information you need is in the very next section.
  2. Installing is also easy; if you wish to read the code and bump into it yourself, follow along.
  3. Note that there might be some additional steps in order to use our solutions.
  4. If there is a problem, please do not hesitate to call us.

Getting started: 60 seconds with Textformer

First of all, we have examples. Yes, they are commented. Just browse to examples/, choose your subpackage, and follow the example. We have high-level examples for most tasks we could think of.

Alternatively, if you wish to learn even more, please take a minute:

Textformer is based on the following structure, and you should pay attention to its tree:

- textformer
    - core
        - model
    - datasets
        - generative
        - translation
    - models
        - decoders
            - bi_gru
            - conv
            - gru
            - lstm
            - self_attention
        - encoders
            - bi_gru
            - conv
            - gru
            - lstm
            - self_attention
        - layers
            - attention
            - multi_head_attention
            - position_wide_forward
            - residual_attention
        - att_seq2seq
        - conv_seq2seq
        - joint_seq2seq
        - seq2seq
        - transformer
    - utils
        - constants
        - exception
        - logging
        - visualization
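
As a quick orientation, the tree maps directly onto import paths. The class names below are assumptions inferred from the module names, so check the docs or examples/ for the canonical spellings:

    # Hypothetical imports inferred from the tree above; the class names
    # are assumptions based on the module names, not a verified API.
    from textformer.models.seq2seq import Seq2Seq
    from textformer.models.transformer import Transformer
    from textformer.datasets.generative import GenerativeDataset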

Core

The core is the core. Essentially, it is the parent of everything. Here you will find the parent classes that define the basis of our structure, providing the variables and methods that help construct other modules.

Datasets

Because we need data, right? Datasets are composed of classes and methods that prepare your data before feeding it to the transformers.
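
As a hedged sketch of what that looks like (the constructor below is an assumption, the file path is a placeholder, and the package appears to build on torchtext-style fields):

    from torchtext.data import Field

    from textformer.datasets.generative import GenerativeDataset

    # A placeholder path; point this at your own plain-text corpus.
    FILE_PATH = 'data/corpus.txt'

    # A torchtext field describing how raw text should be processed.
    source_field = Field(lower=True)

    # Assumed constructor: the corpus path plus the field that parses it.
    dataset = GenerativeDataset(FILE_PATH, source_field)

    # Builds the vocabulary over the freshly loaded dataset.
    source_field.build_vocab(dataset, min_freq=1)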

Models

Each neural network architecture is defined in this package. From Seq2Seq to Transformers, you can use whatever suits your needs.
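
For a taste, a minimal (and assumed) instantiation could look like the sketch below; the argument names are placeholders, so consult examples/ for each architecture's real signature:

    from textformer.models.seq2seq import Seq2Seq

    # Assumed constructor arguments: vocabulary sizes for the input and
    # output sides plus embedding and hidden dimensions. In practice, the
    # vocabulary sizes would come from the dataset's fields.
    model = Seq2Seq(n_input=1000, n_output=1000, n_embedding=256, n_hidden=512)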

Utils

This is a utility package. Common things shared across the application should be implemented here. It is better to implement something once and use it as you wish than to re-implement the same thing over and over again.
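
One hypothetical example: if utils/logging follows the common pattern of exposing a logger factory (we are guessing at the helper's name here), it would be used like so:

    from textformer.utils import logging

    # get_logger is an assumed name based on the module tree; verify the
    # helper that textformer.utils.logging actually exports.
    logger = logging.get_logger(__name__)
    logger.info('Shared utilities keep the rest of the codebase lean.')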


Installation

We believe that everything has to be easy. Not tricky or daunting, Textformer should be the go-to package that you need, from the very first installation to the daily tasks. Just run the following under your preferred Python environment (raw, conda, virtualenv, whatever):

pip install textformer

Alternatively, if you prefer to install the bleeding-edge version, please clone this repository and use:

pip install .
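
Either way, a quick smoke test confirms that the install worked (assuming the package exposes a top-level __version__, which is common but worth verifying):

    # Sanity check after installation; __version__ is assumed to exist at
    # the package's top level, as is usual for PyPI-published packages.
    import textformer

    print(textformer.__version__)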

Environment configuration

Note that sometimes there is a need for additional setup. If that is the case, the platform-specific notes below cover the details.

Ubuntu

No specific additional commands needed.

Windows

No specific additional commands needed.

MacOS

No specific additional commands needed.


Support

We know that we do our best, but it is inevitable to acknowledge that we make mistakes. If you ever need to report a bug or a problem, or just want to talk to us, please do so! We will do our best to help at this repository or via gustavo.rosa@unesp.br.

