
nmdc-metadata-suggestor-ai-tool

A Python application that powers the AI metadata suggestor tool for the NMDC Submission Portal. The project uses modern Python tooling: uv for dependency management and Docker for containerization.

Prerequisites

  • Python 3.12 or higher
  • uv (or use Docker)
  • Docker and Docker Compose (for containerized development)

Quick Start

Option 1: Using uv (Local Development)

  1. Install uv (if not already installed):

    curl -LsSf https://astral.sh/uv/install.sh | sh
    # or
    pip install uv
    
  2. Clone and setup:

    git clone https://github.com/microbiomedata/nmdc-metadata-suggestor-ai-tool.git
    cd nmdc-metadata-suggestor-ai-tool
    
  3. Install dependencies:

    uv sync
    
  4. Configure environment:

    cp .env.example .env
    # Edit .env and add your API keys
    
  5. Use the package in Python. Start an interpreter with `uv run python`, then:

    from nmdc_metadata_suggestor_ai_tool.llm_client import LLMClient
    from nmdc_metadata_suggestor_ai_tool.recommendation_pipeline import run_recommendation_pipeline
    
    submission_object = {
        # NMDC submission JSON payload
    }
    
    client = LLMClient(access_provider="gcp")
    result = run_recommendation_pipeline(submission_object, client)
    print(result.model_dump())
    

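The `result.model_dump()` call above suggests the pipeline returns a pydantic-style model (see `models/llm_output.py` in the project structure below). The general shape can be sketched with stdlib dataclasses so it runs anywhere; the field names here are purely illustrative, not the project's actual schema:

```python
from dataclasses import dataclass, asdict, field

@dataclass
class Suggestion:
    slot: str          # metadata slot being suggested (illustrative name)
    value: str         # suggested value
    confidence: float  # model confidence, 0.0-1.0

@dataclass
class SuggestionResult:
    suggestions: list[Suggestion] = field(default_factory=list)

result = SuggestionResult([Suggestion("env_medium", "soil", 0.9)])
print(asdict(result))
# → {'suggestions': [{'slot': 'env_medium', 'value': 'soil', 'confidence': 0.9}]}
```

The real model is a pydantic `BaseModel`, so `model_dump()` plays the role `asdict()` plays here.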
Option 2: Using Docker

  1. Clone the repository:

    git clone https://github.com/microbiomedata/nmdc-metadata-suggestor-ai-tool.git
    cd nmdc-metadata-suggestor-ai-tool
    
  2. Configure environment:

    cp .env.example .env
    # Edit .env and add your API keys
    
  3. Run with Docker Compose (development):

    docker-compose up
    
  4. Or build and run production image:

    docker build -t nmdc-suggestor .
    docker run --env-file .env nmdc-suggestor
    

Development

Project Structure

nmdc-metadata-suggestor-ai-tool/
├── src/
│   └── nmdc_metadata_suggestor_ai_tool/
│       ├── __init__.py
│       ├── recommendation_pipeline.py       # Pipeline orchestration
│       ├── llm_client.py                    # LLM client for AI interactions
│       ├── cli/
│       │   ├── __init__.py
│       │   └── doi_cli.py                   # DOI operations CLI
│       ├── models/
│       │   ├── __init__.py
│       │   ├── doi.py                       # DOI data models
│       │   └── llm_output.py                # LLM output model
│       └── publication_ingestion/
│           ├── __init__.py
│           ├── download_pdf.py              # PDF retrieval logic
│           └── retreive_pdf_link.py         # PDF link discovery
├── tests/                                    # Test files
├── scripts/                                  # Vertex AI test scripts
├── docs/                                     # Documentation
├── pyproject.toml                            # Project dependencies and metadata
├── Dockerfile                                # Production Docker image
├── Dockerfile.dev                            # Development Docker image
├── docker-compose.yml                        # Docker Compose configuration
├── .env.example                              # Example environment variables
└── README.md                                 # This file

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=src/nmdc_metadata_suggestor_ai_tool

# Run specific test file
uv run pytest tests/test_example.py
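Tests that exercise the pipeline logic without network access can stub out the LLM client. A minimal sketch of that pattern (the class and function names here are illustrative stand-ins, not the project's actual API):

```python
import json

class StubLLMClient:
    """Returns a canned response instead of calling a real LLM."""
    def complete(self, prompt: str) -> str:
        return json.dumps({"suggestions": [{"field": "depth", "value": "0-10 cm"}]})

def run_pipeline(submission: dict, client: StubLLMClient) -> dict:
    # A real pipeline would build a prompt from the submission;
    # here we just round-trip the stubbed response.
    return json.loads(client.complete(str(submission)))

def test_pipeline_returns_suggestions():
    result = run_pipeline({"sampleData": []}, StubLLMClient())
    assert result["suggestions"][0]["field"] == "depth"
```

Swapping the stub for the real `LLMClient` is then an integration-test concern, kept separate from fast unit tests.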

Code Quality

# Format code with Black
uv run black src tests

# Lint with Ruff
uv run ruff check src tests

# Type check with MyPy
uv run mypy src
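These tools are typically configured in pyproject.toml. A hypothetical configuration to illustrate the idea (the repository's actual settings may differ):

```toml
[tool.black]
line-length = 88
target-version = ["py312"]

[tool.ruff]
line-length = 88
target-version = "py312"

[tool.mypy]
python_version = "3.12"
```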

Adding Dependencies

# Add a production dependency
uv add package-name

# Add a development dependency
uv add --dev package-name

# Update dependencies
uv sync

Configuration

Configuration is managed through environment variables or a .env file. See .env.example for available options:

  • DEFAULT_MODEL: Default LLM model to use
  • MAX_TOKENS: Maximum tokens for LLM responses
  • TEMPERATURE: Temperature for LLM responses (0.0-1.0)
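A hypothetical .env based on the variables above (the model name and values are placeholders; .env.example is the authoritative list, including any provider API keys):

```shell
# Placeholder values for illustration only
DEFAULT_MODEL=gemini-1.5-pro
MAX_TOKENS=2048
TEMPERATURE=0.2
# ...plus provider credentials as listed in .env.example
```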

Docker Development Workflow

Interactive Development

For interactive development with hot-reload:

# Start container in background
docker-compose up -d

# Execute commands in the container
docker-compose exec app uv run pytest
docker-compose exec app uv run black src

# Access shell
docker-compose exec app bash

# Stop container
docker-compose down

Production Build

# Build production image
docker build -t nmdc-suggestor:latest .

# Run production container
docker run --env-file .env nmdc-suggestor:latest
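The repository ships its own Dockerfile; for orientation only, a minimal uv-based production image often looks something like the sketch below (base image, paths, and the module name are illustrative, not the project's actual Dockerfile):

```dockerfile
FROM python:3.12-slim

# Copy the uv binary from the official image (illustrative pin)
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

WORKDIR /app

# Install locked dependencies first so this layer caches well
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev

COPY src/ src/
CMD ["uv", "run", "python"]
```

Installing dependencies before copying the source keeps the dependency layer cached across source-only changes, which speeds up rebuilds.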

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Run tests and quality checks
  5. Commit your changes (git commit -m 'Add amazing feature')
  6. Push to the branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

License

See LICENSE for licensing terms.
