
Visualization tools for NLP machine learning models.

Project description


Ecco is a Python library for explaining Natural Language Processing models using interactive visualizations.

It provides multiple interfaces to aid the explanation and intuition of Transformer-based language models. Read: Interfaces for Explaining Transformer Language Models.

Ecco runs inside Jupyter notebooks. It is built on top of PyTorch and the Hugging Face transformers library.

The library is currently the alpha release of a research project and is not production-ready. You're welcome to contribute to make it better!

Installation

# Assuming you had PyTorch previously installed
pip install ecco

Documentation

To use the project:

import ecco

# Load pre-trained language model. Setting 'activations' to True tells Ecco to capture neuron activations.
lm = ecco.from_pretrained('distilgpt2', activations=True)

# Input text
text = "The countries of the European Union are:\n1. Austria\n2. Belgium\n3. Bulgaria\n4."

# Generate 20 tokens to complete the input text.
output = lm.generate(text, generate=20, do_sample=True)

# Ecco will output each token as it is generated.

# 'output' now contains the data captured from this run, including the input and output tokens
# as well as neuron activations and input saliency values.

# To view the input saliency
output.saliency()

This does the following:

  1. It loads a pretrained Hugging Face DistilGPT2 model and wraps it in an ecco LM object that does useful things (e.g. it calculates input saliency and can collect neuron activations).

  2. It tells the model to generate 20 tokens.

  3. The model returns an ecco OutputSeq object. This object holds the output sequence along with a lot of data collected during the generation run, including the input sequence and input saliency values. Because we set activations=True in from_pretrained(), it also contains the neuron activation values.

  4. output can now produce various interactive explorables. Examples include:

# To view the input saliency explorable
output.saliency()

# To view input saliency with more details (a bar and % value for each token)
output.saliency(style="detailed")
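The saliency values shown by these views are gradient-based input attributions (gradient × input, as described in the Interfaces for Explaining Transformer Language Models write-up). As a rough intuition only, here is a minimal NumPy sketch on a toy linear model — the data, shapes, and names are illustrative, not Ecco's actual internals:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: 4 "input token" embeddings of size 8, scored by a linear model.
X = rng.normal(size=(4, 8))      # one embedding vector per input token
w = rng.normal(size=8)           # weights producing the predicted token's score
score = X.sum(axis=0) @ w        # the model's output logit

# For this linear model the gradient of the score w.r.t. every token
# embedding is just w, so gradient-times-input reduces to X * w.
saliency = np.abs(X * w).sum(axis=1)   # one attribution value per input token
saliency /= saliency.sum()             # normalize, like the detailed % view

print(saliency.round(3))
```

In a real Transformer the gradient is computed by backpropagation rather than read off analytically, but the aggregation per input token follows the same pattern.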

# output.activations contains the neuron activation values. It has the shape: (layer, neuron, token position)

# We can run non-negative matrix factorization using run_nmf. We pass the number of factors/components to break the activations down into.
nmf_1 = output.run_nmf(n_components=10)

# nmf_1 now contains the necessary data to create the interactive nmf explorable:
nmf_1.explore()
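The idea behind run_nmf can be sketched with a plain-NumPy non-negative matrix factorization (Lee & Seung multiplicative updates) on a toy activations-shaped matrix. Everything below — the random data, the hand-rolled update loop — is illustrative, not Ecco's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for output.activations collapsed to 2-D:
# rows = neurons, columns = token positions (clipped to be non-negative).
A = np.clip(rng.normal(size=(64, 12)), 0, None)

def nmf(A, n_components, n_iter=200, eps=1e-9):
    """Basic NMF via multiplicative updates: A ≈ W @ H, with W, H >= 0."""
    n, m = A.shape
    W = rng.random((n, n_components))
    H = rng.random((n_components, m))
    for _ in range(n_iter):
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(A, n_components=4)
print(A.shape, W.shape, H.shape)  # (64, 12) (64, 4) (4, 12)
```

Each row of H is one factor's activity across token positions — the kind of pattern the interactive nmf explorable visualizes over the generated text.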

Changelog

0.0.8 (2020-11-20)

  • Allowing the project some fresh air.

