Project description
HADES: Homologous Automated Document Exploration and Summarization
A powerful tool for comparing similarly structured documents
Overview
HADES is a Python package for comparing similarly structured documents. HADES is designed to streamline the work of professionals dealing with large volumes of documents, such as policy documents, legal acts, and scientific papers. The tool employs a multi-step pipeline that begins with processing PDF documents using topic modeling, summarization, and analysis of the most important words for each topic. The process concludes with an interactive web app with visualizations that facilitate the comparison of the documents. HADES has the potential to significantly improve the productivity of professionals dealing with high volumes of documents, reducing the time and effort required for comparative document analysis.
Installation
The latest released version of the HADES package is available on the Python Package Index (PyPI):
- Install the spacy en_core_web_lg and en_core_web_sm models for English according to the instructions
- Install the HADES package using pip:
pip install -U hades-nlp
The source code and the development version are currently hosted on GitHub.
Usage
The HADES package is designed to be used in a Python environment. The package can be imported as follows:
from hades.data_loading import load_processed_data
from hades.topic_modeling import ModelOptimizer, save_data_for_app, set_openai_key
from my_documents_data import PARAGRAPHS, COMMON_WORDS, STOPWORDS
The load_processed_data function loads the documents to be processed, the ModelOptimizer class optimizes the topic modeling process, the save_data_for_app function saves the data for the interactive web app, and the set_openai_key function sets the OpenAI API key.
my_documents_data contains the information about the documents to be processed: the PARAGRAPHS variable is a list of strings representing the paragraphs of the documents, the COMMON_WORDS variable holds the most common words in the documents, and the STOPWORDS variable is a list of words that should be excluded from the analysis. A hypothetical sketch of such a module is shown below.
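The following is a minimal, hypothetical sketch of what a my_documents_data module could look like; the paragraph names, common words, and stopwords are placeholders, and COMMON_WORDS is shown as a mapping keyed by paragraph name because the usage example below looks it up as COMMON_WORDS[paragraph]. Adapt all of it to your own corpus.
# my_documents_data.py -- hypothetical example, adapt to your own documents
PARAGRAPHS = ["introduction", "energy_policy", "climate_goals"]

# Domain-specific frequent words per paragraph, looked up as COMMON_WORDS[paragraph]
COMMON_WORDS = {
    "introduction": ["country", "strategy", "policy"],
    "energy_policy": ["energy", "renewable", "grid"],
    "climate_goals": ["emission", "target", "reduction"],
}

# Words excluded from the analysis in addition to standard stop words
STOPWORDS = ["shall", "hereby", "thereof"]
With these definitions in place, the imports above resolve and the loop over PARAGRAPHS shown later can pass the matching common words to each ModelOptimizer.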
First, the documents are loaded and processed:
set_openai_key("my openai key")
data_path = "my/data/path"
processed_df = load_processed_data(
    data_path=data_path,
    stop_words=STOPWORDS,
    id_column='country',
    flattened_by_col='my_column',
)
After the documents are loaded, the topic modeling process is optimized for each paragraph:
model_optimizers = []
for paragraph in PARAGRAPHS:
    filter_dict = {'paragraph': paragraph}
    model_optimizer = ModelOptimizer(
        processed_df,
        'country',
        'section',
        filter_dict,
        "lda",
        COMMON_WORDS[paragraph],
        (3, 6),
        alpha=100
    )
    model_optimizer.name_topics_automatically_gpt3()
    model_optimizers.append(model_optimizer)
For each paragraph, the ModelOptimizer class is used to optimize the topic modeling process. The name_topics_automatically_gpt3 function automatically names the topics using the OpenAI GPT-3 API. Users can also use the name_topics_manually function to name the topics manually.
Finally, the data is saved for the interactive web app:
save_data_for_app(model_optimizers, path='path/to/results', do_summaries=True)
The save_data_for_app function saves the data for the interactive web app. The do_summaries parameter is set to True to generate summaries for each topic.
When the data is saved, the interactive web app can be launched:
hades run-app --config path/to/results/config.json
Download files
File details
Details for the file hades_nlp-0.1.0.tar.gz.
File metadata
- Download URL: hades_nlp-0.1.0.tar.gz
- Upload date:
- Size: 29.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.10.2 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7e753bc035eca8637fe343fa24cc88df7e6abad9af25ff48effa5e3471a2a826
MD5 | 97352e0ddae51a37c4fd86003617d18f
BLAKE2b-256 | 3a8b034ae1e3ac5f6cfe50e4297e6faf31b35bfc3d902923a1313ab1e0876471
File details
Details for the file hades_nlp-0.1.0-py3-none-any.whl.
File metadata
- Download URL: hades_nlp-0.1.0-py3-none-any.whl
- Upload date:
- Size: 36.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.10.2 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | af33ac2070b698cb18b8f0c0edaf80d80d964069f04186bbcc21897223dde488
MD5 | 54e5f3a6f1cbeb072783e9d08504109b
BLAKE2b-256 | 1bc10febe8b38db5436ade4bea0ccdc98f8a9e70fd34e3fc1d445c70f200aae3