
A Python package for extracting structured metadata information from arXiv papers.

Project description

NERxiv logo

License: PolyForm Noncommercial 1.0.0

NERxiv

Named Entity Recognition for arXiv papers (NERxiv) is a Python tool for extracting structured metadata from scientific papers on arXiv using LLMs and modern retrieval-augmented generation (RAG) techniques.

Visit the documentation page to learn how to use this tool.

What It Does

  • Uses pyrxiv to fetch, download, and extract text from arXiv papers
  • Chunks and embeds text with SentenceTransformers or LangChain, then categorizes paper content using local LLMs (via Ollama)
  • Includes CLI tools and notebook tutorials for reproducible workflows
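As a rough sketch of the chunking step above (the helper below is illustrative, not NERxiv's actual API; the embedding call is shown as a comment because it downloads a model):

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks ready for embedding."""
    step = size - overlap
    # Stop once a chunk can reach the end of the text, so the tail is covered.
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

# Embedding the chunks (requires `pip install sentence-transformers`):
# from sentence_transformers import SentenceTransformer
# model = SentenceTransformer("all-MiniLM-L6-v2")
# embeddings = model.encode(chunk_text(paper_text))
```

Overlapping chunks help the retriever keep sentences that straddle a chunk boundary.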

Installation

Install the core package:

pip install nerxiv

Running LLMs Locally

We recommend running your own models locally using Ollama:

# Install Ollama (follow instructions on their website)
ollama pull <model-name>   # e.g., llama3, deepseek-r1, qwen3:30b

# Start the local server
ollama serve
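Once the server is running, it exposes a REST API on port 11434; the sketch below builds a request for Ollama's /api/generate endpoint (the model name is just an example):

```python
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming generation request payload for Ollama."""
    return {"model": model, "prompt": prompt, "stream": False}

# With `ollama serve` running, send it with any HTTP client:
# import requests
# reply = requests.post(OLLAMA_URL, json=build_request("llama3", "Summarize: ...")).json()
# print(reply["response"])
```

Setting "stream": False returns the full completion in a single JSON object instead of line-delimited chunks.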

Development

To contribute to NERxiv or run it locally, follow these steps:

Clone the Repository

git clone https://github.com/JosePizarro3/NERxiv.git
cd NERxiv

Set Up a Virtual Environment

We recommend Python ≥ 3.10:

python3 -m venv .venv
source .venv/bin/activate

Install Dependencies

Use uv (faster than pip) to install the package in editable mode with dev and docu extras:

pip install --upgrade pip
pip install uv
uv pip install -e .[dev,docu]

Run tests

Use pytest with verbosity to run all tests:

python -m pytest -sv tests

To check code coverage:

python -m pytest --cov=nerxiv tests
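New tests follow the usual pytest convention of test_*.py files containing test_* functions; the helper below is a hypothetical stand-in, not part of NERxiv's public API:

```python
# tests/test_example.py — a minimal test in the pytest style.

def normalize_arxiv_id(raw: str) -> str:
    """Strip an 'arXiv:' prefix and surrounding whitespace (illustrative helper)."""
    return raw.strip().removeprefix("arXiv:")

def test_normalize_arxiv_id():
    assert normalize_arxiv_id("arXiv:2106.01345") == "2106.01345"
    assert normalize_arxiv_id("  2106.01345 ") == "2106.01345"
```

Any file matching tests/test_*.py is picked up automatically by the pytest commands above.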

Code formatting and linting

We use Ruff for formatting and linting (configured via pyproject.toml).

Check linting issues:

ruff check .

Auto-format code:

ruff format .

Run ruff format . --check instead to report formatting issues without modifying files.

Manually fix anything Ruff cannot handle automatically.
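The actual settings live in the repository's pyproject.toml; an illustrative Ruff configuration of this shape (not the project's real values) looks like:

```toml
[tool.ruff]
line-length = 88
target-version = "py310"

[tool.ruff.lint]
# E = pycodestyle errors, F = pyflakes, I = import sorting
select = ["E", "F", "I"]
```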

Documentation writing

To view the documentation locally, make sure the extra [docu] dependencies are installed:

uv pip install -e '.[docu]'

Note: This command installs mkdocs, mkdocs-material, and other documentation-related dependencies.

The first time, build the site:

mkdocs build

Run the documentation server:

mkdocs serve

The output looks like:

INFO    -  Building documentation...
INFO    -  Cleaning site directory
INFO    -  [14:07:47] Watching paths for changes: 'docs', 'mkdocs.yml'
INFO    -  [14:07:47] Serving on http://127.0.0.1:8000/

Open http://127.0.0.1:8000/ in your browser. Changes to the documentation's Markdown files are reflected as soon as they are saved (the local site refreshes automatically).
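The site is driven by the repository's mkdocs.yml; a minimal configuration of this shape (illustrative, not the project's actual file) looks like:

```yaml
site_name: NERxiv
theme:
  name: material   # provided by mkdocs-material
nav:
  - Home: index.md
```

mkdocs serve watches both this file and the docs/ directory, as the log output above shows.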

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nerxiv-0.1.0.tar.gz (13.9 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

nerxiv-0.1.0-py3-none-any.whl (13.4 MB)

Uploaded Python 3

File details

Details for the file nerxiv-0.1.0.tar.gz.

File metadata

  • Download URL: nerxiv-0.1.0.tar.gz
  • Upload date:
  • Size: 13.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nerxiv-0.1.0.tar.gz:

  • SHA256: 019ad6f6d89a8d5acfd3f811167d58be7471dd174e05ef2eaca02510962c1f48
  • MD5: 0725479b6a5d8cadf7049cd6c32104c0
  • BLAKE2b-256: 9680c790e939a7b6a7a1c07d1646cee18a9a008af312d2c25847e6f9b0c0a726

See more details on using hashes here.

File details

Details for the file nerxiv-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: nerxiv-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 13.4 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nerxiv-0.1.0-py3-none-any.whl:

  • SHA256: 2150215b3e8caa593662c7bd7194e5192fdab05438800184a794bbba6381c978
  • MD5: 25d52e74f663b2397323936f8eaa0e60
  • BLAKE2b-256: 963017bc8f7c56dba25aaa59aded53bb482447467645d2aa33dd5d688cab78b0

See more details on using hashes here.
