
A Python package for extracting structured metadata from arXiv papers.


NERxiv

Named Entity Recognition for arXiv papers (NERxiv) is a Python tool for extracting structured metadata from scientific papers on arXiv using LLMs and retrieval-augmented generation (RAG) techniques.

Visit the documentation page to learn how to use this tool.

What It Does

  • Uses pyrxiv to fetch, download, and extract text from arXiv papers
  • Chunks and embeds text with SentenceTransformers or LangChain, then categorizes paper content using local LLMs (via Ollama)
  • Includes CLI tools and notebook tutorials for reproducible workflows
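The chunk-and-embed step can be illustrated with a minimal sketch. This is not NERxiv's internal implementation; the chunk size and overlap below are illustrative values, and the embedding call is shown only as a comment since it requires downloading a SentenceTransformers model:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping word-based chunks.

    chunk_size and overlap are illustrative values, not NERxiv's defaults.
    """
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunk = words[start:start + chunk_size]
        if chunk:
            chunks.append(" ".join(chunk))
        if start + chunk_size >= len(words):
            break
    return chunks

# Each chunk would then be embedded for retrieval, e.g. with SentenceTransformers:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("all-MiniLM-L6-v2")
#   vectors = model.encode(chunks)

paper_text = " ".join(f"word{i}" for i in range(500))
chunks = chunk_text(paper_text)
print(len(chunks))             # 3 overlapping chunks for 500 words
print(len(chunks[0].split()))  # 200 words in the first chunk
```

Overlapping chunks help the retriever avoid cutting a relevant passage in half at a chunk boundary.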

Installation

Install the core package:

pip install nerxiv

Running LLMs Locally

We recommend running your own models locally using Ollama:

# Install Ollama (follow instructions on their website)
ollama pull <model-name>   # e.g., llama3, deepseek-r1, qwen3:30b

# Start the local server
ollama serve
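Once the server is running, any HTTP client can talk to it. A minimal sketch using only the standard library (the /api/generate endpoint and the model/prompt/stream payload fields are Ollama's documented REST API; the model name and prompt are just examples):

```python
import json
import urllib.request

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama to return one complete JSON response
    # instead of a stream of partial tokens.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3",
             url: str = "http://localhost:11434/api/generate") -> str:
    payload = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires a running `ollama serve`:
#   print(generate("Summarize this abstract in one sentence: ..."))
payload = build_payload("llama3", "hello")
print(payload["model"])  # llama3
```

NERxiv drives these requests for you; the sketch only shows what happens under the hood when a local model is queried.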

Development

To contribute to NERxiv or run it locally, follow these steps:

Clone the Repository

git clone https://github.com/JosePizarro3/NERxiv.git
cd NERxiv

Set Up a Virtual Environment

We recommend Python ≥ 3.10:

python3 -m venv .venv
source .venv/bin/activate

Install Dependencies

Use uv (faster than pip) to install the package in editable mode with dev and docu extras:

pip install --upgrade pip
pip install uv
uv pip install -e .[dev,docu]

Run tests

Use pytest with verbosity to run all tests:

python -m pytest -sv tests

To check code coverage:

python -m pytest --cov=nerxiv tests

Code formatting and linting

We use Ruff for formatting and linting (configured via pyproject.toml).

Check linting issues:

ruff check .

Auto-format code:

ruff format .

Manually fix anything Ruff cannot handle automatically.

Documentation writing

To view the documentation locally, make sure the [docu] extra is installed:

uv pip install -e '.[docu]'

Note: This command installs mkdocs, mkdocs-material, and other documentation-related dependencies.

The first time, build the static site:

mkdocs build

Run the documentation server:

mkdocs serve

The output looks like:

INFO    -  Building documentation...
INFO    -  Cleaning site directory
INFO    -  [14:07:47] Watching paths for changes: 'docs', 'mkdocs.yml'
INFO    -  [14:07:47] Serving on http://127.0.0.1:8000/

Open http://127.0.0.1:8000/ in your browser. Changes to the documentation's Markdown files are reflected as soon as they are saved (the local site reloads automatically).
