A package for running neurosynth-compose analyses
Project description
compose-runner
Python package to execute meta-analyses created with Neurosynth Compose, using NiMARE as the meta-analysis execution engine.
AWS Deployment
This repository includes an AWS CDK application that turns compose-runner into a serverless batch pipeline using Step Functions, AWS Lambda, and ECS Fargate. The deployed architecture works like this:
- **ComposeRunnerSubmit** (Lambda Function URL) accepts HTTP requests, validates the meta-analysis payload, and starts a Step Functions execution. The response is immediate and returns both a durable `job_id` (the execution ARN) and the `artifact_prefix` used for S3 and log correlation.
- A Standard state machine runs a single Fargate task (`compose_runner.ecs_task`) and waits for completion. The container downloads inputs, executes the meta-analysis on up to 4 vCPU / 30 GiB of memory, uploads artifacts to S3, and writes `metadata.json` into the same prefix.
- **ComposeRunnerStatus** (Lambda Function URL) wraps `DescribeExecution`, merges metadata from S3, and exposes a simple status endpoint suitable for polling.
- **ComposeRunnerLogPoller** streams the ECS CloudWatch Logs for a given `artifact_prefix`, while **ComposeRunnerResultsFetcher** returns presigned URLs for stored artifacts.
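Client interaction with these endpoints reduces to a submit-then-poll loop against the Function URLs. The sketch below is a minimal, hypothetical client helper, not the package's own API: the terminal state names mirror Step Functions execution statuses, and the helper takes an injected `fetch_status` callable (in practice, an HTTP GET to the ComposeRunnerStatus URL) so the loop can be shown without assuming a deployed stack.

```python
import time

# Terminal execution statuses as reported by Step Functions DescribeExecution.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "TIMED_OUT", "ABORTED"}


def poll_until_done(fetch_status, job_id, interval_s=5.0, max_polls=720):
    """Poll until the Step Functions execution reaches a terminal state.

    `fetch_status(job_id)` is expected to return a dict with at least a
    'status' key, as the status endpoint wraps DescribeExecution.
    """
    for _ in range(max_polls):
        status = fetch_status(job_id)
        if status.get("status") in TERMINAL_STATES:
            return status
        time.sleep(interval_s)
    raise TimeoutError(f"job {job_id} did not finish after {max_polls} polls")
```

In a real client, `fetch_status` would be a thin wrapper around an HTTP request carrying the `job_id` returned by the submit endpoint.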
- Create a virtual environment and install the CDK dependencies:
```bash
cd infra/cdk
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
- (One-time per account/region) bootstrap the CDK environment:
```bash
cdk bootstrap
```
- Deploy the stack (supplying the compose-runner version you want baked into the images):
```bash
cdk deploy \
  -c composeRunnerVersion=$(hatch version) \
  -c resultsPrefix=compose-runner/results \
  -c taskCpu=4096 \
  -c taskMemoryMiB=30720
```
Pass `-c resultsBucketName=<bucket>` to use an existing S3 bucket, or omit it to let the stack create and retain a dedicated bucket. Additional knobs:
- `-c stateMachineTimeoutSeconds=32400` to control the max wall clock per run
- `-c submitTimeoutSeconds` / `-c statusTimeoutSeconds` / `-c pollTimeoutSeconds` to tune Lambda timeouts
- `-c taskEphemeralStorageGiB` if the default 21 GiB scratch volume is insufficient
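The override-with-defaults behavior of these knobs can be sketched in plain Python. This is a simplified stand-in for how a CDK app typically resolves `-c` context values (via `node.try_get_context`), not the stack's actual code; the default values are the ones quoted above, and the string-coercion step reflects that CDK context values arrive as strings.

```python
# Defaults quoted in the deployment notes above; resultsBucketName=None
# models "omit it to let the stack create its own bucket".
DEFAULTS = {
    "taskCpu": 4096,
    "taskMemoryMiB": 30720,
    "stateMachineTimeoutSeconds": 32400,
    "taskEphemeralStorageGiB": 21,
    "resultsBucketName": None,
}


def resolve_context(cli_context: dict) -> dict:
    """Merge `-c key=value` overrides onto the defaults, coercing
    numeric knobs from the strings the CLI passes through."""
    resolved = dict(DEFAULTS)
    for key, value in cli_context.items():
        if key in DEFAULTS and isinstance(DEFAULTS[key], int):
            value = int(value)
        resolved[key] = value
    return resolved
```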
The deployment builds both the Lambda image (aws_lambda/Dockerfile) and the
Fargate task image (Dockerfile), provisions the Step Functions state machine,
and configures a public VPC so each task has outbound internet access.
The CloudFormation outputs list the HTTPS endpoints for submission, status,
logs, and artifact retrieval, alongside the Step Functions ARN.
Project details
Release history
Download files
Source Distribution
Built Distribution
File details
Details for the file compose_runner-0.7.3.tar.gz.
File metadata
- Download URL: compose_runner-0.7.3.tar.gz
- Upload date:
- Size: 13.4 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `be23bcf8af31ed1bfe2b4e464b6cdcf76ac96aa9dc32d68f0fe11cb5a4c153ba` |
| MD5 | `e5816eb3a2aa483da4956017cb7abeff` |
| BLAKE2b-256 | `bd745f446f18d48c5c9c3c02be9db425b74729bb3de70f306ba37e5cbb03164e` |
File details
Details for the file compose_runner-0.7.3-py2.py3-none-any.whl.
File metadata
- Download URL: compose_runner-0.7.3-py2.py3-none-any.whl
- Upload date:
- Size: 13.4 MB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `7547ec2c58e230efeb4b7fc41725f97685e7db5584fa59d5a0e6ebc1e24e1868` |
| MD5 | `7efc14ecb016c896d37c0e7a22497645` |
| BLAKE2b-256 | `e9b40d08cc642df121a13fe7e420570366104e2ea534d855ba5f89d8ee981619` |