
T5 Summarisation Using Pytorch Lightning

Project description


title: T5-Summarisation
emoji: ✌
colorFrom: yellow
colorTo: red
sdk: streamlit
app_file: src/visualization/visualize.py
pinned: false
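The front matter above configures the Hugging Face Spaces demo: the SDK is Streamlit and the app entry point is src/visualization/visualize.py. Assuming Streamlit is installed in the active environment (it is expected to be pulled in by make requirements below), the demo can also be started locally:

```bash
# Launch the Streamlit demo locally; assumes streamlit is installed in the active environment.
streamlit run src/visualization/visualize.py
```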


T5 Summarisation Using Pytorch Lightning

Instructions

  1. Clone the repo.
  2. Edit params.yml to set the parameters used to train the model.
  3. Run make dirs to create the missing parts of the directory structure described below.
  4. Optional: Run make virtualenv to create a Python virtual environment. Skip this if you use conda or another environment manager.
    1. Run source env/bin/activate to activate the virtualenv.
  5. Run make requirements to install required python packages.
  6. Run make run to process the data, then train and evaluate the model.
  7. When you're happy with the results, commit your files (including the .dvc files) to git; the full sequence is sketched below.
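Taken together, the steps above amount to roughly the following shell session. This is a sketch: the repository URL and directory are placeholders, and the make targets are the ones listed above.

```bash
# Sketch of the workflow above; replace <repo-url>/<repo-dir> with the actual repository.
git clone <repo-url>
cd <repo-dir>

# Edit params.yml to adjust the training parameters (any editor will do).

make dirs                  # create missing parts of the directory structure
make virtualenv            # optional: create a virtual environment under env/
source env/bin/activate    # activate it (skip if using conda or another manager)
make requirements          # install the required Python packages
make run                   # process data, then train and evaluate the model

# Once happy with the results, commit everything, including the .dvc files.
git add .
git commit -m "Train and evaluate T5 summarisation model"
```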

Project Organization

├── LICENSE
├── Makefile           <- Makefile with commands like `make dirs` or `make clean`
├── README.md          <- The top-level README for developers using this project.
├── data
│   ├── processed      <- The final, canonical data sets for modeling.
│   └── raw            <- The original, immutable data dump.
│
├── models             <- Trained and serialized models, model predictions, or model summaries
│
├── notebooks          <- Jupyter notebooks. Naming convention is a number (for ordering),
│                         the creator's initials, and a short `-` delimited description, e.g.
│                         `1.0-jqp-initial-data-exploration`.
├── references         <- Data dictionaries, manuals, and all other explanatory materials.
│
├── reports            <- Generated analysis as HTML, PDF, LaTeX, etc.
│   ├── metrics.txt             <- Relevant metrics after evaluating the model.
│   └── training_metrics.txt    <- Relevant metrics from training the model.
│
├── requirements.txt   <- The requirements file for reproducing the analysis environment
│
├── setup.py           <- makes project pip installable (pip install -e .) so src can be imported (example below)
├── src                <- Source code for use in this project.
│   ├── __init__.py    <- Makes src a Python module
│   │
│   ├── data           <- Scripts to download or generate data
│   │   ├── make_dataset.py
│   │   └── process_data.py
│   │
│   ├── models         <- Scripts to train models 
│   │   ├── predict_model.py
│   │   ├── train_model.py
│   │   ├── evaluate_model.py
│   │   └── model.py
│   │
│   └── visualization  <- Scripts to create exploratory and results oriented visualizations
│       └── visualize.py
│
├── tox.ini            <- tox file with settings for running tox; see tox.testrun.org
└── data.dvc           <- DVC file tracking the data used to train the model.
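As noted for setup.py in the tree above, the project is meant to be installed in editable mode so that src is importable as a package. A minimal sketch, run from the repository root:

```bash
# Install the project in editable mode so that `src` can be imported.
pip install -e .

# Quick check that the src package resolves to the local source tree.
python -c "import src; print(src.__file__)"
```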

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
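Since the distributions below are published on PyPI under the name t5s, the usual route is pip rather than a manual download; the version pin here matches the files listed below.

```bash
# Install the published package from PyPI (0.1.2 matches the files listed below).
pip install t5s==0.1.2
```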

Source Distribution

t5s-0.1.2.tar.gz (3.1 kB)


Built Distribution

t5s-0.1.2-py3-none-any.whl (3.6 kB)


File details

Details for the file t5s-0.1.2.tar.gz.

File metadata

  • Download URL: t5s-0.1.2.tar.gz
  • Upload date:
  • Size: 3.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for t5s-0.1.2.tar.gz
  • SHA256: fe324245899b227a0aacc940953787c72e5412fb50e77d6b9ef666e3b2b8bc83
  • MD5: 607e95d1267bf64a663969c1bbe3fcd1
  • BLAKE2b-256: ef687c18f19daaebb9b643915fa3793bba06b3a720ae768c94f66f77d87e8080

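If you do download the sdist manually, it can be checked against the SHA256 digest above. A minimal sketch using GNU coreutils' sha256sum, assuming the tarball is in the current directory:

```bash
# Verify the downloaded sdist against the SHA256 digest listed above.
echo "fe324245899b227a0aacc940953787c72e5412fb50e77d6b9ef666e3b2b8bc83  t5s-0.1.2.tar.gz" | sha256sum --check
```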

File details

Details for the file t5s-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: t5s-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 3.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for t5s-0.1.2-py3-none-any.whl
  • SHA256: b39dc65438133e43cbb19b89614103022d447200a1f99584ee64b0c523f88d99
  • MD5: 747fee3c8b4e9f64107629e8a6ae76fd
  • BLAKE2b-256: 96fd661b78f6b8b61d82ac751e17f9d0c5105a5c362f69b91b412c535ab3dd5a

