

Wordcloud Scripts

Description

This is a collection of scripts that manipulate report data files (downloaded from the Susmon front-end) and other report data files created by researchers to produce counts of the top words and to create word clouds.

Key scripts:

main.py - Main program loop that determines actions to be taken and calls relevant methods
esg.py - ESG word clouds. Works from the standard report data file downloaded from the database and from a bespoke report created by the research team
netzero.py - Net Zero word clouds and word counts. Works from a filtered report file provided by the research team
Other modules contain common functions used by both scripts

Installation

The scripts are written in Python 3. The required libraries and their versions are listed in requirements.txt.
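
If the scripts are being run from a copy of the repository, the dependencies can typically be installed into a Python 3 environment with pip:

pip install -r requirements.txt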

There are two control files that must be in place for the scripts to run:

stopwords.csv - generic words and phrases to be excluded from word clouds
phrases.csv - word combinations that should be treated as phrases in word clouds 

These files can be uploaded and updated via the main code loop as described below.
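
The internal layout of these files is not documented here, but purely as an illustration (a minimal sketch assuming one entry per row, not the project's actual code), they could be loaded and applied to raw report text roughly like this:

import csv
import re

def load_entries(path):
    # Read one term per row from a control CSV file (assumed layout).
    with open(path, newline="", encoding="utf-8") as f:
        return [row[0].strip().lower() for row in csv.reader(f) if row]

def preprocess(text, stopwords, phrases):
    # Lower-case the text, join known phrases into single tokens,
    # then drop generic stopwords before counting word frequencies.
    text = text.lower()
    for phrase in phrases:
        text = text.replace(phrase, phrase.replace(" ", "_"))
    tokens = re.findall(r"[a-z_]+", text)
    return [t for t in tokens if t not in stopwords]

stop_words = set(load_entries("stopwords.csv"))
phrase_list = load_entries("phrases.csv")
print(preprocess("Net zero targets for the supply chain", stop_words, phrase_list))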

Usage

The code is run as follows:

python -m src.clouds.main <input file> <action "create"/"update"> [--resource <"phrases"/"stopwords">] [--sector <sector name>] [--classification <"Environmental"/"Social"/"Governance"/"All">]
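
For example (the file names and the sector name below are purely illustrative):

python -m src.clouds.main report_data.csv create --sector "Household Products"
python -m src.clouds.main research_report.csv create --classification Environmental
python -m src.clouds.main extra_stopwords.csv update --resource stopwords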

When used with the action argument "create", the program will use the "input file" to generate word clouds as follows:

if <input file> name contains "NZ", or "Net Zero" a set of Net Zero word clouds will be created
otherwise
if a <sector name> is supplied, word clouds will be generated for each brand owners in that sector and each classification. In addition word clouds will be created for all brand owners comnined for each classification and for all brand owners across all classifications. The input file in this case should be a standard report data file downloaded from the database
if no <sector name> is supplied and a <classification> other than "All" has been supplied, a word cloud will be created for the <classification> supplied for all brands in the input file combined. The input file in this case should be one provided for this purpose by the research team.
if no <sector name> is supplied and a <classification> of "All" has been supplied, a word cloud will be created for the all classifications and all brands in the input file combined. The input file in this case should be one provided for this purpose by the research team.

When used with the action argument "update", the program will use the "input file" to update one of the two control files as determined by the resource argument "phrases", or "stopwords".

Support

[to be written]

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

cloudvis-0.0.20.tar.gz (85.9 kB)

Uploaded Source

Built Distribution

cloudvis-0.0.20-py3-none-any.whl (12.2 kB)

Uploaded Python 3

File details

Details for the file cloudvis-0.0.20.tar.gz.

File metadata

  • Download URL: cloudvis-0.0.20.tar.gz
  • Upload date:
  • Size: 85.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for cloudvis-0.0.20.tar.gz

  • SHA256: b3b7dabe4b6755853e7b9a65b2921fe69e28b72c5b25e7adba929e6de7872b90
  • MD5: 07c3a01d46b12964102e9ef81664dd6b
  • BLAKE2b-256: 6941fa3521351452004b64bce5350b39e6b40c85014e345cca2cce0e12cd662b

See more details on using hashes here.
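
For instance, a downloaded archive can be checked against the published SHA256 digest with Python's standard hashlib module (this assumes the file is in the current directory):

import hashlib

expected = "b3b7dabe4b6755853e7b9a65b2921fe69e28b72c5b25e7adba929e6de7872b90"
with open("cloudvis-0.0.20.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("hash OK" if digest == expected else "hash mismatch")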

File details

Details for the file cloudvis-0.0.20-py3-none-any.whl.

File metadata

  • Download URL: cloudvis-0.0.20-py3-none-any.whl
  • Upload date:
  • Size: 12.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for cloudvis-0.0.20-py3-none-any.whl

  • SHA256: ffcba99f0f6491b41406e6aa64646bf72c61890efe23d1a7b65f569f4321c564
  • MD5: 200497238d92793b7b8e931e1f54ef67
  • BLAKE2b-256: caf1a237e9e55c117958da455b88c4dce671338a16889926afc56b1742ad64bb

See more details on using hashes here.
