
T5 Summarisation Using PyTorch Lightning

Project description


title: T5-Summarisation
emoji: ✌
colorFrom: yellow
colorTo: red
sdk: streamlit
app_file: src/visualization/visualize.py
pinned: false


T5 Summarisation Using PyTorch Lightning

Instructions

  1. Clone the repo.
  2. Edit `params.yml` to set the parameters used to train the model.
  3. Run `make dirs` to create the missing parts of the directory structure described below.
  4. Optional: Run `make virtualenv` to create a Python virtual environment. Skip this step if you use conda or another environment manager.
    1. Run `source env/bin/activate` to activate the virtualenv.
  5. Run `make requirements` to install the required Python packages.
  6. Process your data, then train and evaluate your model with `make run` (the full sequence is condensed in the shell session after this list).
  7. When you're happy with the result, commit the files (including the `.dvc` files) to git.
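
For convenience, the steps above condense into the shell session below. This is a sketch, assuming a bash-compatible shell, the default Makefile targets, and a placeholder repository URL; adapt the virtualenv step if you use conda or another environment manager.

# Clone the repository and enter it (replace <repo-url> and <repo-name>).
git clone <repo-url>
cd <repo-name>

# Create the missing parts of the directory structure.
make dirs

# Optional: create and activate a Python virtual environment.
make virtualenv
source env/bin/activate

# Install the required Python packages.
make requirements

# Process the data, then train and evaluate the model.
make run

# Commit the results, including the generated .dvc files.
git add .
git commit -m "Train and evaluate T5 summarisation model"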

Project Organization

├── LICENSE
├── Makefile           <- Makefile with commands like `make dirs` or `make clean`
├── README.md          <- The top-level README for developers using this project.
├── data
│   ├── processed      <- The final, canonical data sets for modeling.
│   └── raw            <- The original, immutable data dump.
│
├── models             <- Trained and serialized models, model predictions, or model summaries
│
├── notebooks          <- Jupyter notebooks. Naming convention is a number (for ordering),
│                         the creator's initials, and a short `-` delimited description, e.g.
│                         `1.0-jqp-initial-data-exploration`.
├── references         <- Data dictionaries, manuals, and all other explanatory materials.
│
├── reports            <- Generated analysis as HTML, PDF, LaTeX, etc.
│   ├── metrics.txt    <- Relevant metrics after evaluating the model.
│   └── training_metrics.txt    <- Relevant metrics from training the model.
│
├── requirements.txt   <- The requirements file for reproducing the analysis environment
│
├── setup.py           <- Makes the project pip installable (`pip install -e .`) so src can be imported; see the example below.
├── src                <- Source code for use in this project.
│   ├── __init__.py    <- Makes src a Python module
│   │
│   ├── data           <- Scripts to download or generate data
│   │   ├── make_dataset.py
│   │   └── process_data.py
│   │
│   ├── models         <- Scripts to train models 
│   │   ├── predict_model.py
│   │   ├── train_model.py
│   │   ├── evaluate_model.py
│   │   └── model.py
│   │
│   └── visualization  <- Scripts to create exploratory and results oriented visualizations
│       └── visualize.py
│
├── tox.ini            <- tox file with settings for running tox; see tox.testrun.org
└── data.dvc           <- Training a model on the processed data.
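
Because setup.py makes the project pip installable, the src package can be installed in editable mode and imported from anywhere. A minimal sketch, assuming it is run from the repository root:

# Install the project in editable mode so `src` is importable.
pip install -e .

# Quick sanity check that the package resolves.
python -c "import src; print(src.__file__)"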

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

t5s-0.1.0.tar.gz (3.1 kB, uploaded as Source; see file details below)

Built Distribution

t5s-0.1.0-py3-none-any.whl (3.6 kB, uploaded for Python 3; see file details below)

File details

Details for the file t5s-0.1.0.tar.gz.

File metadata

  • Download URL: t5s-0.1.0.tar.gz
  • Upload date:
  • Size: 3.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for t5s-0.1.0.tar.gz
  • SHA256: 47a781e46bde81da8ea12f17ad462124bc2eae577c273869092a1b7789adc790
  • MD5: 7a92e2bafc1893cbe13fbbd3dda34b8a
  • BLAKE2b-256: 33fcbfdf947ba7de6eac127594420e697faa234ff82780fb1005ddfe06bd11f0

See more details on using hashes here.
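
To check a downloaded file against the hashes above, one option is to fetch the source distribution with pip and compute its SHA256 locally. A minimal sketch, assuming a Unix-like shell with sha256sum available:

# Download only the t5s 0.1.0 source distribution, without dependencies.
pip download t5s==0.1.0 --no-deps --no-binary :all:

# Compute the SHA256 and compare it with the value listed above.
sha256sum t5s-0.1.0.tar.gz
# Expected: 47a781e46bde81da8ea12f17ad462124bc2eae577c273869092a1b7789adc790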

File details

Details for the file t5s-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: t5s-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 3.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for t5s-0.1.0-py3-none-any.whl
  • SHA256: 5f49926d649eac4f4efd12852ac6c57393a51254928fc37ab99e680ac256b90f
  • MD5: 43bc99fd467dbc25a513f982aaf53f18
  • BLAKE2b-256: b951db074e7df4d2e704b6cb5505933db9349c6ca93cf89a01f33e04b4cc7cb6

See more details on using hashes here.
