
Library for extracting references from documents


ScrapeBiblio: PDF Reference Extraction and Verification Library

Powered by Scrapegraphai


This library is designed to extract references from a PDF file, check them against the Semantic Scholar database, and save the results to a Markdown file.

Overview

The library performs the following steps:

  1. Extract Text from PDF: Reads the content of a PDF file and extracts the text.
  2. Split Text into Chunks: Splits the extracted text into smaller chunks to manage large texts efficiently.
  3. Extract References: Uses the OpenAI API to extract references from the text.
  4. Save References: Saves the extracted references to a Markdown file.
  5. Check References in Semantic Scholar: (Optional) Checks if the extracted references are present in the Semantic Scholar database.
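Step 2 above can be illustrated with a minimal chunking helper. Note that `split_into_chunks` and its parameters are a hypothetical sketch, not scrapebiblio's actual API; the library's real chunking strategy may differ:

```python
def split_into_chunks(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into overlapping chunks so long documents fit an LLM context.

    Overlap between consecutive chunks reduces the chance of a reference
    being cut in half at a chunk boundary.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

With the defaults, a 2,500-character document yields three chunks, each sharing its last 100 characters with the start of the next.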

Installation and Setup

To install the library and its dependencies, run:

pip install scrapebiblio

Ensure you have a .env file in the root directory of your project with the following content:

OPENAI_API_KEY="YOUR_OPENAI_KEY"
SEMANTIC_SCHOLARE_API_KEY="YOUR_SEMANTIC_SCHOLAR_KEY"

Usage

To use the library, ensure you have the required environment variables set and run the script. The extracted references will be saved to a Markdown file named references.md.

Example

Here is an example of how to use the library:

import logging
import os
from dotenv import load_dotenv
from biblio.find_reference import process_pdf

logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s')

load_dotenv()

def main():
    """
    Main function that processes a PDF, extracts text, and saves the references.
    """
    pdf_path = 'test/558779153.pdf'
    references_output_path = 'references.md'

    openai_api_key = os.getenv('OPENAI_API_KEY')
    semantic_scholar_api_key = os.getenv('SEMANTIC_SCHOLARE_API_KEY')

    if not openai_api_key:
        raise EnvironmentError("OPENAI_API_KEY environment variable not set.")
    if not semantic_scholar_api_key:
        raise EnvironmentError("SEMANTIC_SCHOLARE_API_KEY environment variable not set.")

    logging.debug("Starting PDF processing...")

    process_pdf(pdf_path, references_output_path, openai_api_key, semantic_scholar_api_key)

    logging.debug("Processing completed.")

if __name__ == "__main__":
    main()
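The optional Semantic Scholar check (step 5 of the overview) can be approximated with the public Semantic Scholar Graph API. The sketch below is illustrative and is not the function the library itself uses; `build_search_url`, `title_found`, and `check_reference` are hypothetical names:

```python
import json
import urllib.parse
import urllib.request

SEARCH_URL = "https://api.semanticscholar.org/graph/v1/paper/search"

def build_search_url(title: str) -> str:
    """Build a Semantic Scholar Graph API search URL for a reference title."""
    params = urllib.parse.urlencode({"query": title, "fields": "title", "limit": 5})
    return f"{SEARCH_URL}?{params}"

def title_found(reference_title: str, response: dict) -> bool:
    """Check whether any paper in a search response matches the title (case-insensitive)."""
    wanted = reference_title.strip().lower()
    return any(p.get("title", "").strip().lower() == wanted
               for p in response.get("data", []))

def check_reference(title: str, api_key: str) -> bool:
    """Query Semantic Scholar and report whether the reference was found."""
    req = urllib.request.Request(build_search_url(title),
                                 headers={"x-api-key": api_key})
    with urllib.request.urlopen(req) as resp:
        return title_found(title, json.load(resp))
```

Separating the URL construction and title matching from the network call keeps the matching logic easy to test without hitting the API.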

Contributing

We welcome contributions to this project. If you would like to contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bugfix.
  3. Make your changes.
  4. Submit a pull request with a detailed description of your changes.

License

This project is licensed under the MIT License. See the LICENSE file for more information.
