
Library for extracting references from documents

Project description

ScrapeBiblio: PDF Reference Extraction and Verification Library

Powered by Scrapegraphai


This library is designed to extract references from a PDF file, check them against the Semantic Scholar database, and save the results to a Markdown file.

Overview

The library performs the following steps:

  1. Extract Text from PDF: Reads the content of a PDF file and extracts the text.
  2. Split Text into Chunks: Splits the extracted text into smaller chunks to manage large texts efficiently.
  3. Extract References: Uses the OpenAI API to extract references from the text.
  4. Save References: Saves the extracted references to a Markdown file.
  5. Check References in Semantic Scholar: (Optional) Checks if the extracted references are present in the Semantic Scholar database.
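Step 2 above (splitting the text into chunks) can be sketched in plain Python. Note that `split_into_chunks` is a hypothetical helper for illustration: the chunk size and overlap the library actually uses are not documented here. Overlapping chunks are a common choice so that a reference straddling a chunk boundary still appears whole in at least one chunk.

```python
def split_into_chunks(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into overlapping chunks of at most chunk_size characters.

    Consecutive chunks share `overlap` characters so a reference that
    falls on a boundary is not cut in half in every chunk.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back to create the overlap
    return chunks
```

Each chunk can then be sent to the OpenAI API independently, and the per-chunk reference lists merged afterwards.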

Installation and Setup

To install the library, use the following command:

pip install scrapebiblio

Ensure you have a .env file in the root directory of your project with the following content:

OPENAI_API_KEY="YOUR_OPENAI_KEY"
SEMANTIC_SCHOLARE_API_KEY="YOUR_SEMANTIC_SCHOLAR_KEY"

Usage

To use the library, ensure you have the required environment variables set and run the script. The extracted references will be saved to a Markdown file named references.md.

Example

Here is an example of how to use the library:

import logging
import os
from dotenv import load_dotenv
from biblio.find_reference import process_pdf

logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s')

load_dotenv()

def main():
    """
    Main function that processes a PDF, extracts text, and saves the references.
    """
    pdf_path = 'test/558779153.pdf'
    references_output_path = 'references.md'

    openai_api_key = os.getenv('OPENAI_API_KEY')
    semantic_scholar_api_key = os.getenv('SEMANTIC_SCHOLARE_API_KEY')

    if not openai_api_key:
        raise EnvironmentError("OPENAI_API_KEY environment variable not set.")
    if not semantic_scholar_api_key:
        raise EnvironmentError("SEMANTIC_SCHOLARE_API_KEY environment variable not set.")

    logging.debug("Starting PDF processing...")

    process_pdf(pdf_path, references_output_path, openai_api_key, semantic_scholar_api_key)

    logging.debug("Processing completed.")

if __name__ == "__main__":
    main()

Contributing

We welcome contributions to this project. If you would like to contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bugfix.
  3. Make your changes.
  4. Submit a pull request with a detailed description of your changes.

License

This project is licensed under the MIT License. See the LICENSE file for more information.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

scrapebiblio-0.0.2.tar.gz (1.1 MB)

Uploaded Source

Built Distribution


scrapebiblio-0.0.2-py3-none-any.whl (10.5 kB)

Uploaded Python 3

File details

Details for the file scrapebiblio-0.0.2.tar.gz.

File metadata

  • Download URL: scrapebiblio-0.0.2.tar.gz
  • Upload date:
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.3

File hashes

Hashes for scrapebiblio-0.0.2.tar.gz
Algorithm Hash digest
SHA256 4a759a964ca44d7aa728357b2484c878b258d28fad276ed569b2bdde7153df78
MD5 5e766b13bb6a2c527b12c73f52ceda94
BLAKE2b-256 e9a39da635a5931626be0fef0085ac37bdceab42bc22bafc9fd7d1bf4dfadf42


File details

Details for the file scrapebiblio-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: scrapebiblio-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.3

File hashes

Hashes for scrapebiblio-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 92e5a511217ca4f02e9414baeb23b505549be704c29bbe0f2b8433667d99203d
MD5 6e1112a34f3547b97f7e43b87298e66c
BLAKE2b-256 ec29ccb8edd41afc8c32f9257237f8002749c12e701759830c4a41d4e9ec2208

