Visualization tools for NLP machine learning models.

Project description

Ecco Logo

Ecco is a Python library for explaining Natural Language Processing models using interactive visualizations.

It provides multiple interfaces that aid in explaining and building intuition about Transformer-based language models. Read: Interfaces for Explaining Transformer Language Models.

Ecco runs inside Jupyter notebooks. It is built on top of PyTorch and the Hugging Face transformers library.

The library is currently an alpha release of a research project and is not production ready. You're welcome to contribute to make it better!

Installation

# Assuming you have PyTorch installed
pip install ecco

Documentation

To use the project:

import ecco

# Load pre-trained language model. Setting 'activations' to True tells Ecco to capture neuron activations.
lm = ecco.from_pretrained('distilgpt2', activations=True)

# Input text
text = "The countries of the European Union are:\n1. Austria\n2. Belgium\n3. Bulgaria\n4."

# Generate 20 tokens to complete the input text.
output = lm.generate(text, generate=20, do_sample=True)

# Ecco will output each token as it is generated.

# 'output' now contains the data captured from this run, including the input and output tokens
# as well as neuron activations and input saliency values.

# To view the input saliency
output.saliency()

This does the following:

  1. It loads a pretrained Hugging Face DistilGPT2 model and wraps it in an Ecco LM object that does useful things (e.g., it calculates input saliency and can collect neuron activations).

  2. We tell the model to generate 20 tokens.

  3. The model returns an Ecco OutputSeq object. This object holds the output sequence along with a lot of data captured during the generation run, including the input sequence and input saliency values. If we set activations=True in from_pretrained(), it also contains neuron activation values.

  4. output can now produce various interactive explorables. Examples include:

# To view the input saliency explorable
output.saliency()

# To view input saliency with more details (a bar and % value for each token)
output.saliency(style="detailed")

# output.activations contains the neuron activation values. It has the shape: (layer, neuron, token position)

# We can run non-negative matrix factorization using run_nmf. We pass the number of factors/components to break the activations down into.
nmf_1 = output.run_nmf(n_components=10)

# nmf_1 now contains the necessary data to create the interactive nmf explorable:
nmf_1.explore()
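The saliency views above rest on gradient-based attribution. As a rough illustration of the gradient-x-input idea only (a toy linear model, not Ecco's actual computation), the sketch below attributes a prediction score across input token embeddings:

```python
import numpy as np

# Toy "model": each input token embedding contributes to the score of the
# predicted token through a single linear scoring direction. For a linear
# model the gradient of the score w.r.t. each embedding is just the weight
# vector, so gradient-x-input saliency can be computed analytically.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(4, 8))   # 4 input tokens, 8-dim embeddings
weights = rng.normal(size=8)           # scoring direction for the predicted token

# Gradient-x-input: elementwise product of gradient (here, `weights`) and input
grad_x_input = embeddings * weights            # shape (4, 8)
saliency = np.abs(grad_x_input).sum(axis=1)    # aggregate per token (L1)
saliency = saliency / saliency.sum()           # normalize to fractions

print(saliency)  # one non-negative value per input token, summing to 1
```

This mirrors the shape of what `output.saliency(style="detailed")` displays: a relative importance value per input token.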
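The factorization behind run_nmf can also be sketched without Ecco. As an illustration of the underlying technique (the classic Lee-Seung multiplicative updates, not Ecco's implementation), this factors a non-negative toy activations matrix into components and per-token weights:

```python
import numpy as np

# Toy activations matrix with shape (neurons, token positions); NMF requires
# non-negative entries. NMF approximates A as W @ H, where W groups neurons
# into components and H gives each component's weight at each token position.
rng = np.random.default_rng(0)
A = np.abs(rng.normal(size=(50, 12)))   # hypothetical activations
k = 4                                    # number of factors/components

# Lee-Seung multiplicative updates minimizing ||A - W @ H||_F^2
W = np.abs(rng.normal(size=(50, k)))
H = np.abs(rng.normal(size=(k, 12)))
for _ in range(200):
    H *= (W.T @ A) / (W.T @ W @ H + 1e-9)
    W *= (A @ H.T) / (W @ H @ H.T + 1e-9)

error = np.linalg.norm(A - W @ H)
print(error)  # reconstruction error, small relative to ||A||
```

Each row of H plays the role of one factor in the interactive explorable: a pattern of activation strength across the token positions.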

Changelog

0.0.8 (2020-11-20)

  • Allowing the project some fresh air.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ecco-0.0.14.tar.gz (55.6 kB)

Uploaded: Source

Built Distribution

ecco-0.0.14-py2.py3-none-any.whl (76.8 kB)

Uploaded: Python 2, Python 3

File details

Details for the file ecco-0.0.14.tar.gz.

File metadata

  • Download URL: ecco-0.0.14.tar.gz
  • Upload date:
  • Size: 55.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/49.6.0.post20201009 requests-toolbelt/0.9.1 tqdm/4.55.1 CPython/3.8.5

File hashes

Hashes for ecco-0.0.14.tar.gz
Algorithm Hash digest
SHA256 02bceb09cee6ea06d2ab01ba4293d0079c261ef1eb15b1e03ccda3fa21a2785c
MD5 ece29fee4e407eaed478eb39fbd0a232
BLAKE2b-256 5e391d7197b685396b60d3b8cce3f8d610a336e2e0d5c050b33a41659054dd8b

See more details on using hashes here.
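To check a downloaded file against the hashes published above, Python's standard hashlib module is enough. The helper below is a generic sketch; the local filename and the `pip download` invocation in the comment are assumptions about how you obtained the file:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Stream a file from disk and return its SHA256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading the sdist, e.g.:
#   pip download ecco==0.0.14 --no-deps --no-binary :all:
# compare against the SHA256 listed above:
expected = "02bceb09cee6ea06d2ab01ba4293d0079c261ef1eb15b1e03ccda3fa21a2785c"
# assert sha256_of_file("ecco-0.0.14.tar.gz") == expected
```

Streaming in chunks keeps memory use constant regardless of file size, which matters for larger distributions.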

Provenance

File details

Details for the file ecco-0.0.14-py2.py3-none-any.whl.

File metadata

  • Download URL: ecco-0.0.14-py2.py3-none-any.whl
  • Upload date:
  • Size: 76.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/49.6.0.post20201009 requests-toolbelt/0.9.1 tqdm/4.55.1 CPython/3.8.5

File hashes

Hashes for ecco-0.0.14-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 1372b863727024dff972195db2912102fdac2877f70e74877c478273bea697e3
MD5 8f4c5b3ecdf1ffa964f1030418e129cf
BLAKE2b-256 49143a14afc6f76c037ace947895835bf363c0ff63cd2c1283e82e245c8031a2

See more details on using hashes here.

Provenance
