
A library for extracting references from documents

Project description

ScrapeBiblio: PDF Reference Extraction and Verification Library

Powered by Scrapegraphai


This library is designed to extract references from a PDF file, check them against the Semantic Scholar database, and save the results to a Markdown file.

Overview

The library performs the following steps:


  1. Extract Text from PDF: Reads the content of a PDF file and extracts the text.
  2. Split Text into Chunks: Splits the extracted text into smaller chunks to manage large texts efficiently.
  3. Extract References: Uses the OpenAI API to extract references from the text.
  4. Save References: Saves the extracted references to a Markdown file.
  5. Check References in Semantic Scholar: (Optional) Checks if the extracted references are present in the Semantic Scholar database.
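Step 2 (splitting text into chunks) can be sketched as below. This is an illustrative helper, not the library's actual API: the function name, chunk size, and overlap parameter are assumptions. Overlapping chunks reduce the chance that a reference straddling a chunk boundary is lost.

```python
def split_into_chunks(text: str, max_chars: int = 4000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks, each at most max_chars long.

    The overlap ensures a reference that spans a chunk boundary
    appears whole in at least one chunk. Requires overlap < max_chars.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # Step back by `overlap` so the next chunk repeats the tail of this one.
        start = end - overlap
    return chunks
```

Each chunk is then sent to the OpenAI API separately (step 3), and the per-chunk results are merged and deduplicated.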

Installation and Setup

To install the library and its dependencies, run:

pip install scrapebiblio

Ensure you have a .env file in the root directory of your project with the following content:

OPENAI_API_KEY="YOUR_OPENAI_KEY"
SEMANTIC_SCHOLARE_API_KEY="YOUR_SEMANTIC_SCHOLAR_KEY"

Usage

To use the library, ensure you have the required environment variables set and run the script. The extracted references will be saved to a Markdown file named references.md.

Example

Here is an example of how to use the library:

import logging
import os
from dotenv import load_dotenv
from biblio.find_reference import process_pdf

logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s')

load_dotenv()

def main():
    """
    Main function that processes a PDF, extracts text, and saves the references.
    """
    pdf_path = 'test/558779153.pdf'
    references_output_path = 'references.md'

    openai_api_key = os.getenv('OPENAI_API_KEY')
    semantic_scholar_api_key = os.getenv('SEMANTIC_SCHOLARE_API_KEY')

    if not openai_api_key:
        raise EnvironmentError("OPENAI_API_KEY environment variable not set.")
    if not semantic_scholar_api_key:
        raise EnvironmentError("SEMANTIC_SCHOLARE_API_KEY environment variable not set.")

    logging.debug("Starting PDF processing...")

    process_pdf(pdf_path, references_output_path, openai_api_key, semantic_scholar_api_key)

    logging.debug("Processing completed.")

if __name__ == "__main__":
    main()
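The optional step 5 can be approximated with the public Semantic Scholar Graph API. The sketch below is an assumption about how such a check could work, not the library's internal implementation; the function name and the injectable `fetch` parameter (useful for testing without network access) are illustrative.

```python
SEARCH_URL = "https://api.semanticscholar.org/graph/v1/paper/search"

def reference_in_semantic_scholar(title: str, api_key: str, fetch=None) -> bool:
    """Return True if a title search yields at least one match.

    `fetch` defaults to a real HTTP GET via requests; pass a stub in tests.
    """
    if fetch is None:
        def fetch(url, params, headers):
            import requests  # imported lazily so the module loads without it
            resp = requests.get(url, params=params, headers=headers, timeout=10)
            resp.raise_for_status()
            return resp.json()
    data = fetch(
        SEARCH_URL,
        params={"query": title, "fields": "title", "limit": 1},
        headers={"x-api-key": api_key},
    )
    # The Graph API returns matches under the "data" key.
    return bool(data.get("data"))
```

A reference that is not found is not necessarily wrong: it may simply be missing from Semantic Scholar, so absence should be reported rather than treated as an error.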

Contributing

We welcome contributions to this project. If you would like to contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bugfix.
  3. Make your changes.
  4. Submit a pull request with a detailed description of your changes.

License

This project is licensed under the MIT License. See the LICENSE file for more information.



Download files

Download the file for your platform.

Source Distribution

scrapebiblio-1.1.0.tar.gz (1.1 MB)


Built Distribution


scrapebiblio-1.1.0-py3-none-any.whl (10.5 kB)


File details

Details for the file scrapebiblio-1.1.0.tar.gz.

File metadata

  • Download URL: scrapebiblio-1.1.0.tar.gz
  • Upload date:
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for scrapebiblio-1.1.0.tar.gz:

  • SHA256: 3c9ebe62194f8aaf061ea72c0acd6beac770e4f8ee1982f5ab4a0c82a0d1c6e1
  • MD5: 330651f5e9370fe8a3ca555a24579473
  • BLAKE2b-256: 8428cb886d935779593bd5d2ed1ab430d9fb7edeab4df154908e4aa63f9ab204


File details

Details for the file scrapebiblio-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: scrapebiblio-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for scrapebiblio-1.1.0-py3-none-any.whl:

  • SHA256: 613e0e914cac44430792e8fef821cb39a4e493024ddab21380310ed6dfcdb1da
  • MD5: 0e2ce076fa861d2563309f811dac8b6c
  • BLAKE2b-256: 0fc12cc0c8dbda00e296af7d9ba0f247b46197ce90d04442b088291b702c849f

