NMDC Submission portal metadata suggestor tool, powered by AI

nmdc-metadata-suggestor-ai-tool

A Python application for the NMDC Submission portal metadata suggestor tool, powered by AI. This project uses modern Python tooling with uv for dependency management and Docker for containerization.

Prerequisites

  • Python 3.12 or higher
  • uv (or use Docker)
  • Docker and Docker Compose (for containerized development)

Quick Start

LLM Configuration:

You will need to set up a .env file. Copy the example first:

cp .env.example .env

Environment variables used by LLMClient and ConversationManager:

  • AI_INCUBATOR_KEY: API key for PNNL AI Incubator (when using access_provider=pnnl).
  • AI_INCUBATOR_BASE_URL: Base URL for the PNNL AI Incubator API.
  • GOOGLE_APPLICATION_CREDENTIALS: Path to a GCP service account JSON file (for Vertex AI).
  • VERTEX_PROJECT_ID: (Optional) GCP project ID for Vertex AI. If not provided, the SDK attempts to infer it from the credentials.
  • GEMINI_REGION: (Optional) GCP region for Gemini/Vertex (defaults to us-east5 or CLOUD_ML_REGION).
  • CBORG_KEY: API key for CBORG (when using access_provider=cborg).
  • CBORG_BASE_URL: Base URL for the CBORG API.

The LLMClient will read the appropriate variables depending on access_provider (set to pnnl, cborg, or gcp).
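
As a hedged sketch of the dispatch described above (the helper name, mapping, and region fallback order are illustrative assumptions, not the tool's actual internals), the provider-to-variable lookup might look like:

```python
import os

# Hypothetical mapping from access_provider to the env vars it relies on;
# the real LLMClient may organize this differently.
PROVIDER_ENV_VARS = {
    "pnnl": ("AI_INCUBATOR_KEY", "AI_INCUBATOR_BASE_URL"),
    "cborg": ("CBORG_KEY", "CBORG_BASE_URL"),
    "gcp": ("GOOGLE_APPLICATION_CREDENTIALS",),
}


def resolve_provider_config(access_provider: str) -> dict:
    """Collect the environment variables a given provider relies on."""
    try:
        var_names = PROVIDER_ENV_VARS[access_provider]
    except KeyError:
        raise ValueError(
            f"access_provider must be one of {sorted(PROVIDER_ENV_VARS)}"
        )
    config = {name: os.environ.get(name) for name in var_names}
    if access_provider == "gcp":
        # Assumed fallback chain: GEMINI_REGION, then CLOUD_ML_REGION,
        # then the us-east5 default mentioned above.
        config["region"] = (
            os.environ.get("GEMINI_REGION")
            or os.environ.get("CLOUD_ML_REGION")
            or "us-east5"
        )
    return config
```

Failing fast on an unknown provider keeps misconfiguration errors close to their source instead of surfacing as a missing-key error deep inside an API call.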

Environment variables are loaded from a .env file in the project root via python-dotenv. Variables already set in your shell take precedence over .env values (override=False is the default).
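
The override=False precedence rule can be illustrated with a minimal, self-contained stand-in for python-dotenv (the real library does much more; this only demonstrates that shell values win):

```python
import os
import tempfile


def load_env_file(path: str) -> None:
    """Minimal illustration of python-dotenv's override=False behavior:
    a variable already set in the shell is never overwritten."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault leaves any existing shell value untouched
            os.environ.setdefault(key.strip(), value.strip())


# Demo: the shell value wins over the .env value
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write("DEFAULT_MODEL=from-dotenv\nMAX_TOKENS=1024\n")
    env_path = fh.name

os.environ["DEFAULT_MODEL"] = "from-shell"
os.environ.pop("MAX_TOKENS", None)
load_env_file(env_path)
# DEFAULT_MODEL stays "from-shell"; MAX_TOKENS is filled in from the file
```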

Option 1: Using uv (Local Development)

  1. Install uv (if not already installed):

    curl -LsSf https://astral.sh/uv/install.sh | sh
    # or
    pip install uv
    
  2. Clone and setup:

    git clone https://github.com/microbiomedata/nmdc-metadata-suggestor-ai-tool.git
    cd nmdc-metadata-suggestor-ai-tool
    
  3. Install dependencies:

    uv sync
    
  4. Configure environment:

    cp .env.example .env
    # Edit .env and add your API keys
    
  5. Use the package in Python. Start a REPL with:

    uv run python
    
    Then, inside the REPL:
    
    from nmdc_metadata_suggestor_ai_tool.llm_client import LLMClient
    from nmdc_metadata_suggestor_ai_tool.recommendation_pipeline import run_recommendation_pipeline
    
    submission_object = {
        # NMDC submission JSON payload
    }
    
    client = LLMClient(access_provider="gcp")
    result = run_recommendation_pipeline(submission_object, client)
    print(result.model_dump())
    

Advanced: direct ConversationManager usage (optional)

from nmdc_metadata_suggestor_ai_tool.llm_client import LLMClient, ConversationManager

client = LLMClient(access_provider="gcp")
conversation = ConversationManager(llm_client=client)
# Add plain text context (pdf_files may be a list of local PDF paths)
conversation.add_message(text="Please summarize the submission.", pdf_files=None)
# Add any schema context to guide the model
conversation.add_schema_context("<schema description here>")
response = conversation.generate(model="gemini-2.5-flash", max_tokens=1024, gemini_temperature=0.2)
print(response)

Option 2: Using Docker

  1. Clone the repository:

    git clone https://github.com/microbiomedata/nmdc-metadata-suggestor-ai-tool.git
    cd nmdc-metadata-suggestor-ai-tool
    
  2. Configure environment:

    cp .env.example .env
    # Edit .env and add your API keys
    
  3. Run with Docker Compose (development):

    docker-compose up
    
  4. Or build and run production image:

    docker build -t nmdc-suggestor .
    docker run --env-file .env nmdc-suggestor
    

Development

Project Structure

nmdc-metadata-suggestor-ai-tool/
├── src/
│   └── nmdc_metadata_suggestor_ai_tool/
│       ├── __init__.py
│       ├── recommendation_pipeline.py       # Pipeline orchestration
│       ├── llm_client.py                    # LLM client for AI interactions
│       ├── cli/
│       │   ├── __init__.py
│       │   └── doi_cli.py                   # DOI operations CLI
│       ├── models/
│       │   ├── __init__.py
│       │   ├── doi.py                       # DOI data models
│       │   └── llm_output.py                # LLM output model
│       └── publication_ingestion/
│           ├── __init__.py
│           ├── download_pdf.py              # PDF retrieval logic
│           └── retreive_pdf_link.py         # PDF link discovery
├── tests/                                    # Test files
├── scripts/                                  # Vertex AI test scripts
├── docs/                                     # Documentation
├── pyproject.toml                            # Project dependencies and metadata
├── Dockerfile                                # Production Docker image
├── Dockerfile.dev                            # Development Docker image
├── docker-compose.yml                        # Docker Compose configuration
├── .env.example                              # Example environment variables
└── README.md                                 # This file

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=src/nmdc_metadata_suggestor_ai_tool

# Run specific test file
uv run pytest tests/test_example.py

Code Quality

# Format code with Ruff
uv run ruff format

# Lint with Ruff
uv run ruff check

# Type check with MyPy
uv run mypy src

Adding Dependencies

# Add a production dependency
uv add package-name

# Add a development dependency
uv add --dev package-name

# Update dependencies
uv sync

Configuration

Configuration is managed through environment variables or a .env file. See .env.example for available options:

  • DEFAULT_MODEL: Default LLM model to use
  • MAX_TOKENS: Maximum tokens for LLM responses
  • TEMPERATURE: Temperature for LLM responses (0.0-1.0)
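
A hedged sketch of how these variables might be read with type coercion and defaults (the function name and default values here are illustrative assumptions, not what the tool actually uses):

```python
import os


def read_llm_config() -> dict:
    """Read LLM settings from the environment, coercing types and
    validating ranges. Defaults below are placeholders."""
    temperature = float(os.environ.get("TEMPERATURE", "0.2"))
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("TEMPERATURE must be between 0.0 and 1.0")
    return {
        "model": os.environ.get("DEFAULT_MODEL", "gemini-2.5-flash"),
        "max_tokens": int(os.environ.get("MAX_TOKENS", "1024")),
        "temperature": temperature,
    }
```

Coercing and validating once, at startup, means a typo such as TEMPERATURE=2.0 fails immediately rather than producing odd model output later.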

Docker Development Workflow

Interactive Development

For interactive development with hot-reload:

# Start container in background
docker-compose up -d

# Execute commands in the container
docker-compose exec app uv run pytest
docker-compose exec app uv run ruff format

# Access shell
docker-compose exec app bash

# Stop container
docker-compose down

Production Build

# Build production image
docker build -t nmdc-suggestor:latest .

# Run production container
docker run --env-file .env nmdc-suggestor:latest

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Run tests and quality checks
  5. Commit your changes (git commit -m 'Add amazing feature')
  6. Push to the branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

License

See LICENSE for licensing terms.
