aind-metadata-service

REST service to retrieve metadata from various AIND databases, hosted at http://aind-metadata-service/. Spins up queryable endpoints for each database to fetch metadata.

User Installation & Usage

These instructions are for users at the Allen Institute using the existing deployment.

Using the HTTP API

To programmatically interact with the HTTP API, you can use the requests library. For example, to fetch procedures for a particular subject_id:

import requests

subject_id = "000000"
url = f"http://aind-metadata-service/procedures/{subject_id}"
response = requests.get(url)
response.raise_for_status()
rj = response.json()

data = rj.get("data")
message = rj.get("message")

This pattern works with any of the HTTP endpoints. More information about the available endpoints and acceptable queries can be found in the service's API documentation.
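All of the endpoints follow the same {base}/{endpoint}/{identifier} URL shape, so a small helper can keep calling code uniform. The sketch below is illustrative and not part of the package; the endpoint names are taken from the examples elsewhere in this README:

```python
def endpoint_url(
    endpoint: str,
    identifier: str,
    base: str = "http://aind-metadata-service",
) -> str:
    """Build the request URL for a metadata-service endpoint.

    e.g. endpoint_url("procedures", "000000")
    -> "http://aind-metadata-service/procedures/000000"
    """
    return f"{base}/{endpoint}/{identifier}"
```

You could then call requests.get(endpoint_url("procedures", subject_id)) and unpack the data and message fields as shown above.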

Using the Metadata Service API Client

The client provides a simple interface to the API. It can be installed with pip:

pip install "aind-metadata-service[client]"

Once installed, you can use the provided AindMetadataServiceClient to fetch metadata.

from aind_metadata_service.client import AindMetadataServiceClient

# Initialize client with the server domain
# If you're at the Allen Institute, use one of these domains:
client = AindMetadataServiceClient(domain="http://aind-metadata-service")  # production
# client = AindMetadataServiceClient(domain="http://aind-metadata-service-dev")  # development

# Subject and procedures
subject_data = client.get_subject("775745").json()
procedures_data = client.get_procedures("775745").json()

# Intended measurements and other data
measurements = client.get_intended_measurements("775745").json()
injection_materials = client.get_injection_materials("VT3214G").json()
ecephys_sessions = client.get_ecephys_sessions("775745").json()
perfusions = client.get_perfusions("775745").json()

# Protocol and funding information 
protocol_info = client.get_protocols("Protocol-123").json()
funding_info = client.get_funding("Project-ABC").json()
project_names = client.get_project_names().json()

# SLIMS data
imaging_data = client.get_smartspim_imaging(
    subject_id="775745",
    start_date_gte="2023-01-01",
    end_date_lte="2023-12-31"
).json()

histology_data = client.get_histology(subject_id="775745").json()

Deployment

Install the server to host your own metadata-service, whether locally for development or in a production environment.

Server Installation

The server can be pip installed using pip install "aind-metadata-service[server]".

Installing pyodbc

pyodbc depends on the system ODBC libraries (unixODBC on Debian-based systems), so installing it inside a container requires network access to the distribution's package mirrors. A build without that access fails with an error like:

#10 23.69 Err:1 http://deb.debian.org/debian bullseye/main amd64 libodbc1 amd64 2.3.6-0.1+b1
#10 23.69   Could not connect to debian.map.fastlydns.net:80 (146.75.42.132). - connect (111: Connection refused) Unable to connect to deb.debian.org:http:

Development

Development dependencies can be pip installed using pip install "aind-metadata-service[dev]". Once the development environment is set up, use Docker to run the service locally.

Running Locally with Docker

1. Build the container

docker build . -t aind-metadata-service-local:latest

2. Using AWS Credentials from Local Machine

If your AWS credentials are already configured on your machine (~/.aws/credentials on Linux/macOS or %USERPROFILE%\.aws\credentials on Windows), you can mount your credentials directly into the container:

  1. Run the container with AWS credentials mounted:

docker run -v ~/.aws:/root/.aws -e AWS_PROFILE={profile} -e AWS_PARAM_STORE_NAME={param name} -p 58350:58350 -p 5000:5000 aind-metadata-service-local:latest

This allows the container to use your locally configured AWS credentials without needing to pass them explicitly.

This will start the service on port 5000. You can access it at:

http://localhost:5000

  2. If you run into errors reading your AWS configuration, explicitly set the region and AWS config file path:

MSYS_NO_PATHCONV=1 docker run -v {aws_config_file_path} -e AWS_PROFILE={profile} -e AWS_PARAM_STORE_NAME={param name} -e AWS_DEFAULT_REGION={region} -p 58350:58350 -p 5000:5000 aind-metadata-service-local:latest

  3. If your AWS configuration is not set up, you can request credentials.

3. Using Environment File

You can also run the container with credentials defined in a .env file. Check the .env.template for required variables.

docker run -it -p 58350:58350 -p 5000:5000 --env-file=.env aind-metadata-service-local

Contributing

Linters and testing

There are several libraries used to run linters, check documentation, and run tests.

The following checks are enforced through GitHub Actions CI:

  • flake8 to check that code is up to standards (no unused imports, etc.)
  • interrogate to check documentation coverage
  • coverage for 100% test coverage requirement

Checks should ideally be run locally before pushing to GitHub:

Test your changes using the coverage library, which will run the tests and log a coverage report:

coverage run -m unittest discover && coverage report

Use interrogate to check that modules, methods, etc. have been documented thoroughly:

interrogate --verbose .

Use flake8 to check that code is up to standards (no unused imports, etc.):

flake8 .

Additional recommended but optional tools:

Use black to automatically format the code according to PEP 8:

black .

Use isort to automatically sort import statements:

isort .

Optional: Pre-commit Hooks

To automatically run style checks before each commit (recommended but optional):

# Install the hooks (pre-commit itself is already included in dev dependencies)
pre-commit install

The hooks will run automatically on each commit, but can be bypassed with git commit --no-verify if needed.

Code style specifications:

  • Line length: 79 characters
  • Python version: 3.10+
  • Style guide: PEP 8 (enforced by flake8)

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We primarily use the Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect the build system or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bug fix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
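The pattern above can be checked mechanically. Here is a rough sketch using Python's re module; the regex is an approximation of the rule for illustration, not an official check used by this repository:

```python
import re

# <type>(<scope>): <short summary> -- the scope is optional
COMMIT_RE = re.compile(
    r"^(build|ci|docs|feat|fix|perf|refactor|test)"  # mandatory type
    r"(\([\w./-]+\))?"                               # optional scope
    r": .+"                                          # short summary
)

def is_valid_commit_message(message: str) -> bool:
    """Return True if the first line follows the Angular-style pattern."""
    return COMMIT_RE.match(message.splitlines()[0]) is not None
```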

Documentation

To generate the rst source files for documentation, run

sphinx-apidoc -o doc_template/source/ src

Then to create the documentation html files, run

sphinx-build -b html doc_template/source/ doc_template/build/html

More info on sphinx installation can be found here: https://www.sphinx-doc.org/en/master/usage/installation.html

API Response Codes

There are six possible status code responses for aind-metadata-service:

  • 200: successfully retrieved valid data without any problems.
  • 300: queried the server, but more items were returned than expected.
  • 404: found no data that matches the query.
  • 406: successfully retrieved some data, but it failed to validate against the pydantic models.
  • 500: successfully connected to labtracks/sharepoint, but some other server error occurred.
  • 503: failed to connect to labtracks/sharepoint servers.

These status codes are defined in the StatusCodes enum in response_handler.py.
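In client code it is often convenient to branch on these codes. A minimal sketch follows; the helper and category names are hypothetical, and only the codes themselves come from the list above:

```python
def interpret_status(code: int) -> str:
    """Map an aind-metadata-service status code to a rough category."""
    categories = {
        200: "valid",         # data retrieved and validated
        300: "multiple",      # more items returned than expected
        404: "not_found",     # no data matched the query
        406: "invalid",       # data retrieved but failed model validation
        500: "server_error",  # connected, but another server error occurred
        503: "unavailable",   # could not connect to upstream servers
    }
    return categories.get(code, "unexpected")
```

Note that a 406 response may still carry usable content in the data field, so inspecting the code before discarding a response can preserve partially valid metadata.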
