
ai-citer

AI-powered fact extraction and citation mapping for documents — PDF, Word, web pages, and plain text.

Built on FastAPI + Anthropic Claude. Extracts verbatim-quoted facts from documents, maps each quote back to its exact character offset, and optionally assigns PDF page numbers.

Install

pip install ai-citer

Requires Python 3.11+ and a PostgreSQL database.

Quick start

Run as a standalone server

Set environment variables (or create a .env file):

ANTHROPIC_API_KEY=sk-ant-...
DATABASE_URL=postgresql://user:pass@localhost/ai_citer
ai-citer serve          # starts on :3001
ai-citer serve --port 8080 --reload

Or with uvicorn directly:

uvicorn app.main:app --port 3001

Embed the router in your own FastAPI app

from fastapi import FastAPI
from ai_citer import documents_router

app = FastAPI()
app.include_router(documents_router, prefix="/ai-citer")

Note: the router expects an asyncpg pool at app.state.pool and an Anthropic async client at app.state.anthropic_client. Use the lifespan from app.main as a reference, or set them up yourself, as in the sketch below.
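
If you prefer to wire this up yourself rather than reuse the lifespan from app.main, a minimal sketch could look like the following. It assumes create_pool returns an asyncpg pool (as the note above says) and that closing the pool on shutdown is the only cleanup needed:

import os
from contextlib import asynccontextmanager

import anthropic
from fastapi import FastAPI

from ai_citer import create_pool, init_db, documents_router

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Set up the two objects the router expects on app.state.
    app.state.pool = await create_pool(os.environ["DATABASE_URL"])
    await init_db(app.state.pool)
    app.state.anthropic_client = anthropic.AsyncAnthropic(
        api_key=os.environ["ANTHROPIC_API_KEY"]
    )
    yield
    # Close the pool again on shutdown.
    await app.state.pool.close()

app = FastAPI(lifespan=lifespan)
app.include_router(documents_router, prefix="/ai-citer")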

Use the core functions directly

import anthropic
import asyncio
from ai_citer import (
    create_pool, init_db,
    extract_facts, map_citations, assign_page_numbers,
    parse_pdf, parse_word, parse_web, parse_text,
)

async def main():
    pool = await create_pool("postgresql://localhost/mydb")
    await init_db(pool)

    client = anthropic.AsyncAnthropic(api_key="sk-ant-...")

    # Parse a PDF
    with open("report.pdf", "rb") as f:
        content = parse_pdf(f.read())

    # Extract facts
    extraction, usage = await extract_facts(client, content.rawText)

    # Map quotes back to character offsets
    facts = map_citations(content.rawText, extraction.facts)
    print(facts[0].citations[0].charOffset)   # exact position in raw text
    print(f"Cost: ${usage.costUsd:.4f}")

asyncio.run(main())
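
The other parsers in the import list follow the same pattern. The calls below are only a guess at their signatures (by analogy with parse_pdf), so treat them as hypothetical and check the package source:

# Hypothetical usage -- signatures assumed, not taken from the package docs
with open("memo.docx", "rb") as f:
    word_content = parse_word(f.read())   # assumed: raw .docx bytes, like parse_pdf

text_content = parse_text("Plain text to extract facts from.")   # assumed: a plain str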

REST API

When running as a server, the following endpoints are available under /api/documents:

Method  Path          Description
GET     /             List all documents
POST    /             Upload a file (multipart/form-data) or URL (url form field)
GET     /:id          Get a document (includes pdfData for PDFs)
POST    /:id/extract  Run fact extraction (optional { "prompt": "..." } body)
GET     /:id/facts    Get all accumulated facts for a document
POST    /:id/chat     Chat with a document ({ "message": "...", "history": [] })
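
As a quick illustration against a local server on the default port 3001, a client session might look roughly like this. Only the routes and the url form field come from the table above; the response field names are assumptions, so inspect the actual payloads:

import requests

BASE = "http://localhost:3001/api/documents"

# Upload a document by URL (the table above also allows multipart file uploads)
doc = requests.post(f"{BASE}/", data={"url": "https://example.com/report.pdf"}).json()
doc_id = doc["id"]  # field name assumed; check the real response

# Run fact extraction with an optional custom prompt
requests.post(f"{BASE}/{doc_id}/extract", json={"prompt": "Focus on dates and figures."})

# Fetch all accumulated facts for the document
facts = requests.get(f"{BASE}/{doc_id}/facts").json()
print(facts)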

MCP server

ai-citer ships an MCP server that exposes extraction tools to AI assistants (Claude Desktop, etc.):

ai-citer mcp

Tools: upload_document_url, extract_facts, get_facts, list_documents.
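
For Claude Desktop, an entry along these lines in claude_desktop_config.json should register the server. The mcpServers layout is Claude Desktop's standard MCP config format; the env values are placeholders for your own credentials:

{
  "mcpServers": {
    "ai-citer": {
      "command": "ai-citer",
      "args": ["mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-...",
        "DATABASE_URL": "postgresql://user:pass@localhost/ai_citer"
      }
    }
  }
}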

Environment variables

Variable           Required  Description
ANTHROPIC_API_KEY  Yes       Anthropic API key
DATABASE_URL       Yes       PostgreSQL connection string

Development

git clone https://github.com/czawora/ai-citer
cd ai-citer/server
pip install -e ".[dev]"
pytest

