
A Python package for extracting structured metadata information from arXiv papers.

Project description



NERxiv

Named Entity Recognition for arXiv papers (NERxiv) is a Python wrapper tool for extracting structured metadata from scientific papers on arXiv using LLMs and modern retrieval-augmented generation (RAG) techniques.

Visit the documentation page to learn how to use this tool.

What It Does

  • Uses pyrxiv to fetch, download, and extract text from arXiv papers
  • Chunks and embeds text with SentenceTransformers or LangChain, then categorizes paper content using local LLMs (via Ollama)
  • Includes CLI tools and notebook tutorials for reproducible workflows
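The chunk-then-embed step above can be pictured as a sliding window over the paper's text. The sketch below is only illustrative: `chunk_text` is a hypothetical helper, not part of nerxiv's actual API, and in the real pipeline each chunk would then be embedded with SentenceTransformers rather than returned as plain strings.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping word-window chunks, ready for embedding.

    Overlap preserves context that would otherwise be cut at chunk boundaries.
    """
    words = text.split()
    chunks = []
    step = chunk_size - overlap  # advance by less than a full window
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(words):
            break  # the last window already covered the tail
    return chunks
```

Each returned chunk shares `overlap` words with its neighbor, which is a common default in RAG pipelines so that a sentence split across two chunks is still retrievable from at least one of them.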

Installation

Install the core package:

pip install nerxiv

Running LLMs Locally

We recommend running your own models locally using Ollama:

# Install Ollama (follow instructions on their website)
ollama pull <model-name>   # e.g., llama3, deepseek-r1, qwen3:30b

# Start the local server
ollama serve
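Once `ollama serve` is running, the model is reachable over Ollama's REST API on `localhost:11434`. The minimal sketch below builds a non-streaming request for the documented `/api/generate` endpoint; `build_request` is a hypothetical helper for illustration, not something nerxiv exposes.

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generation request for Ollama's REST API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

# To actually send it (requires a running `ollama serve`):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=build_request("llama3", "Summarize this abstract: ..."),
#     headers={"Content-Type": "application/json"},
# )
# answer = json.load(urllib.request.urlopen(req))["response"]
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion, which is simpler to handle than the default line-by-line streaming mode.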

Development

To contribute to NERxiv or run it locally, follow these steps:

Clone the Repository

git clone https://github.com/JosePizarro3/NERxiv.git
cd NERxiv

Set Up a Virtual Environment

We recommend Python ≥ 3.10:

python3 -m venv .venv
source .venv/bin/activate

Install Dependencies

Use uv (faster than pip) to install the package in editable mode with dev and docu extras:

pip install --upgrade pip
pip install uv
uv pip install -e .[dev,docu]

Run tests

Use pytest with verbosity to run all tests:

python -m pytest -sv tests

To check code coverage:

python -m pytest --cov=nerxiv tests

Code formatting and linting

We use Ruff for formatting and linting (configured via pyproject.toml).

Check linting issues:

ruff check .

Auto-format code:

ruff format .

Manually fix anything Ruff cannot handle automatically.

Documentation writing

To view the documentation locally, make sure you have installed the extra [docu] dependencies:

uv pip install -e '.[docu]'

Note: This command installs mkdocs, mkdocs-material, and other documentation-related dependencies.

The first time, build the site:

mkdocs build

Run the documentation server:

mkdocs serve

The output looks like:

INFO    -  Building documentation...
INFO    -  Cleaning site directory
INFO    -  [14:07:47] Watching paths for changes: 'docs', 'mkdocs.yml'
INFO    -  [14:07:47] Serving on http://127.0.0.1:8000/

Open http://127.0.0.1:8000/ in your browser. Changes to the documentation's Markdown files are reflected as soon as they are saved (the local site refreshes automatically).

Download files

Download the file for your platform.

Source Distribution

nerxiv-1.1.0.tar.gz (13.9 MB)

Uploaded Source

Built Distribution


nerxiv-1.1.0-py3-none-any.whl (13.4 MB)

Uploaded Python 3

File details

Details for the file nerxiv-1.1.0.tar.gz.

File metadata

  • Download URL: nerxiv-1.1.0.tar.gz
  • Upload date:
  • Size: 13.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nerxiv-1.1.0.tar.gz
Algorithm Hash digest
SHA256 538e6867615ccc2f1639c82e47a45f3fa7c99d11cf6ad61446a8ea8c8b22c1fa
MD5 d19436df27ff3ccff5dcd831a91a32dc
BLAKE2b-256 7f89bde78b1a7d605827b97fff33883d3695c8f7fa5a2fcd0cc69412830d9975


File details

Details for the file nerxiv-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: nerxiv-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 13.4 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nerxiv-1.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 b46721aacbe383071c778633270879ef30857a7f06042fc9728635dd5eec7de7
MD5 4248a534121e0ffd54ab5dae6464433b
BLAKE2b-256 6421484b0df8adf6008ebfe14f6d287dfbd6d5181e7854c2e4e3d41e6e93a8a7

