
NERxiv

Named Entity Recognition for arxiv papers (NERxiv) is a Python wrapper tool for extracting structured metadata from scientific papers on arXiv using LLMs and modern retrieval-augmented generation (RAG) techniques.

Visit the documentation page to learn how to use this tool.

What It Does

  • Uses pyrxiv to fetch, download, and extract text from arXiv papers
  • Chunks and embeds text with SentenceTransformers or LangChain, then categorizes paper content using local LLMs (via Ollama)
  • Includes CLI tools and notebook tutorials for reproducible workflows
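The chunk-and-embed step above can be sketched in plain Python (the chunk sizes and the word-based splitting here are illustrative, not NERxiv's actual implementation):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping word-based chunks for embedding.

    Overlap keeps context that straddles a chunk boundary retrievable
    from both neighboring chunks.
    """
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, max(len(words) - overlap, 1), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
    return chunks

# Embedding the chunks would then look like this (requires
# `pip install sentence-transformers`; model name is an example):
# from sentence_transformers import SentenceTransformer
# model = SentenceTransformer("all-MiniLM-L6-v2")
# embeddings = model.encode(chunks)
```

Embedded chunks can then be ranked by similarity to a query and the top hits passed to the LLM as context, which is the retrieval step of RAG.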

Installation

Install the core package:

pip install nerxiv

Running LLMs Locally

We recommend running your own models locally using Ollama:

# Install Ollama (follow instructions on their website)
ollama pull <model-name>   # e.g., llama3, deepseek-r1, qwen3:30b

# Start the local server
ollama serve
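Once `ollama serve` is running, the server can be queried over Ollama's REST API (`/api/generate` on port 11434). A minimal standard-library sketch; the model name and prompt are examples:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming generation request for the Ollama REST API."""
    return {"model": model, "prompt": prompt, "stream": False}


def query_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server and a pulled model):
# print(query_ollama("llama3", "List the materials studied in this abstract: ..."))
```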

Development

To contribute to NERxiv or run it locally, follow these steps:

Clone the Repository

git clone https://github.com/JosePizarro3/NERxiv.git
cd NERxiv

Set Up a Virtual Environment

We recommend Python ≥ 3.10:

python3 -m venv .venv
source .venv/bin/activate

Install Dependencies

Use uv (faster than pip) to install the package in editable mode with dev and docu extras:

pip install --upgrade pip
pip install uv
uv pip install -e .[dev,docu]

Run tests

Use pytest with verbosity to run all tests:

python -m pytest -sv tests

To check code coverage:

python -m pytest --cov=nerxiv tests

Code formatting and linting

We use Ruff for formatting and linting (configured via pyproject.toml).

Check linting issues:

ruff check .

Auto-format code:

ruff format .

Manually fix anything Ruff cannot handle automatically.

Documentation writing

To view the documentation locally, make sure you have installed the extra [docu] packages:

uv pip install -e '.[docu]'

Note: This command installs mkdocs, mkdocs-material, and other documentation-related dependencies.

The first time, build the static site:

mkdocs build

Run the documentation server:

mkdocs serve

The output looks like:

INFO    -  Building documentation...
INFO    -  Cleaning site directory
INFO    -  [14:07:47] Watching paths for changes: 'docs', 'mkdocs.yml'
INFO    -  [14:07:47] Serving on http://127.0.0.1:8000/

Open http://127.0.0.1:8000/ in your browser. Changes to the documentation's Markdown files are reflected as soon as the files are saved (the local site refreshes automatically).
