
BioDeepagents CLI and skill bundles.


BioDeepagents CLI

Skills

Skills are sourced from BioClaw; the following Bioinformatics & Genomics skills come from claude-scientific-skills:
  • Sequence Analysis: Process DNA/RNA/protein sequences with BioPython and pysam
  • Single-Cell Analysis: Analyze 10X Genomics data with Scanpy, identify cell types, infer GRNs with Arboreto
  • Variant Annotation: Annotate VCF files with Ensembl VEP, query ClinVar for pathogenicity
  • Variant Database Management: Build scalable VCF databases with TileDB-VCF for incremental sample addition, efficient population-scale queries, and compressed storage of genomic variant data
  • Gene Discovery: Query NCBI Gene, UniProt, and Ensembl for comprehensive gene information
  • Network Analysis: Identify protein-protein interactions via STRING, map to pathways (KEGG, Reactome)
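As an illustration of the kind of operation the sequence-analysis skill automates, here is a minimal reverse-complement sketch using only the Python standard library; in BioPython, `Seq("ATGCGT").reverse_complement()` does the same in one call:

```python
# Minimal DNA reverse-complement, standard library only.
# The sequence-analysis skill wraps BioPython for this kind of task;
# this sketch just shows the underlying operation.
COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

if __name__ == "__main__":
    print(reverse_complement("ATGCGT"))  # ACGCAT
```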
| Skill | Description |
| --- | --- |
| bio-orchestrator | Meta-agent that routes bioinformatics requests to specialised sub-skills. Handles file-type detection, analysis planning, report generation, and reproducibility export. |
| claw-ancestry-pca | Ancestry decomposition PCA against the Simons Genome Diversity Project. |
| claw-metagenomics | Shotgun metagenomics profiling: taxonomy, resistome, and functional pathways. |
| claw-semantic-sim | Semantic Similarity Index for disease research literature using PubMedBERT embeddings. |
| drug-photo | Upload a photo of a medication to get pharmacogenomic dosage guidance via a CPIC single-drug lookup. |
| equity-scorer | Compute HEIM diversity and equity metrics from VCF or ancestry data. Generates heterozygosity, FST, and PCA plots, plus a composite HEIM Equity Score with markdown reports. |
| genome-compare | Compare your genome to George Church's (PGP-1) and estimate ancestry composition. |
| labstep | Interact with the Labstep electronic lab notebook API using labstepPy. Query experiments, protocols, resources, inventory, and other lab entities. |
| lit-synthesizer | Search PubMed and bioRxiv, summarise papers with an LLM, build citation graphs, and generate literature-review sections. |
| nutrigx-advisor | Personalised nutrition report from genetic data (23andMe, AncestryDNA, or VCF). |
| pharmgx-reporter | Pharmacogenomic report from DTC genetic data (23andMe/AncestryDNA). |
| repro-enforcer | Export any bioinformatics analysis as a reproducible bundle with a Conda environment, Singularity container definition, and Nextflow pipeline. |
| scrna-orchestrator | Automate single-cell RNA-seq analysis with Scanpy or Seurat: QC, normalisation, clustering, DE analysis, and visualisation. |
| seq-wrangler | Sequence QC, alignment, and BAM processing. Wraps FastQC, BWA/Bowtie2, and SAMtools for automated read-to-BAM pipelines. |
| struct-predictor | Local protein structure prediction with AlphaFold, Boltz, or Chai. Compare predicted structures, compute RMSD, and visualise 3D models. |
| vcf-annotator | Annotate VCF variants with VEP, ClinVar, gnomAD frequencies, and ancestry-aware context. Generates prioritised variant reports. |
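The bio-orchestrator's file-type detection can be pictured as a lookup from file extension to sub-skill. The mapping below is a hypothetical example for illustration, not the skill's actual routing table:

```python
# Illustrative sketch of extension-based skill routing.
# The extension-to-skill mapping is an assumption, not the real table.
from pathlib import Path

SKILL_BY_EXTENSION = {
    ".vcf": "vcf-annotator",
    ".vcf.gz": "vcf-annotator",
    ".fastq": "seq-wrangler",
    ".fastq.gz": "seq-wrangler",
    ".h5ad": "scrna-orchestrator",
    ".pdb": "struct-predictor",
}

def route(path: str) -> str:
    """Pick a sub-skill from the file extension; fall back to the orchestrator."""
    suffixes = "".join(Path(path).suffixes)  # handles double extensions like .vcf.gz
    for ext, skill in SKILL_BY_EXTENSION.items():
        if suffixes.endswith(ext):
            return skill
    return "bio-orchestrator"
```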

This project is based on the deepagents CLI, an open-source coding assistant that runs in your terminal, similar to Claude Code.

Key Features:

  • Built-in Tools: File operations (read, write, edit, glob, grep), shell commands, web search, and subagent delegation
  • Customizable Skills: Add domain-specific capabilities through a progressive disclosure skill system
  • Persistent Memory: Agent remembers your preferences, coding style, and project context across sessions
  • Project-Aware: Automatically detects project roots and loads project-specific configurations

🐳 Docker

Build the CLI image from the repository root so that both the CLI sources and the shared deepagents package are available to Docker. The docker-compose setup hosts the deepagents CLI, along with some other agents (e.g. dsl) inside the same container:

docker compose run --rm --service-ports --build flask-app

Copy the sample configuration into place, edit the credentials or tracing toggles you need, and then run the image directly:

docker run --rm -it -P \
  -v "$(pwd)/workspace/:/workspace/project" \
  --env-file ./.env \
  deepagents-cli

Type naturally as you would in a chat interface. The agent will use its built-in tools, skills, and memory to help you with tasks.

[!WARNING] Human-in-the-Loop (HITL) Approval Required

Potentially destructive operations require user approval before execution:

  • File operations: write_file, edit_file
  • Command execution: shell, execute
  • External requests: web_search, fetch_url
  • Delegation: task (subagents)

Each operation will prompt for approval showing the action details. Use --auto-approve to skip prompts:

deepagents --auto-approve
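The approval flow described above can be pictured as a gate wrapped around each tool call. This is an illustrative sketch only; the tool names come from the list above, but the function and prompt wording are assumptions, not deepagents' actual implementation:

```python
# Illustrative sketch of a human-in-the-loop approval gate.
# The gated tool names mirror the list above; everything else is assumed.
GATED_TOOLS = {"write_file", "edit_file", "shell", "execute",
               "web_search", "fetch_url", "task"}

def run_tool(name, action, *, auto_approve=False, ask=input):
    """Run `action` only if the tool is ungated, auto-approved, or confirmed."""
    if name in GATED_TOOLS and not auto_approve:
        if ask(f"Approve {name}? [y/N] ").strip().lower() != "y":
            return None  # user rejected the operation
    return action()
```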



Download files

Download the file for your platform.

Source Distribution

biodeep-0.1.0.tar.gz (18.6 kB)

Uploaded Source

Built Distribution


biodeep-0.1.0-py3-none-any.whl (14.5 kB)

Uploaded Python 3

File details

Details for the file biodeep-0.1.0.tar.gz.

File metadata

  • Download URL: biodeep-0.1.0.tar.gz
  • Upload date:
  • Size: 18.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.12

File hashes

Hashes for biodeep-0.1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9e6469cc1a013a073169acc945ba651f7333681fd34a0dd6c2cb6303cce3c3f1 |
| MD5 | 89b354c7f58ee75a4000fcc8c6a7c7f5 |
| BLAKE2b-256 | 3b89f6c995343920f1c0eaf772a68206198854de9c77294aa826228b814a08d8 |


File details

Details for the file biodeep-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: biodeep-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 14.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.12

File hashes

Hashes for biodeep-0.1.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | fd6e9dab51afd58928882a762b4b9f1a66bb4d7b600560cc73345bb5a5295372 |
| MD5 | dc9e85a1a360d2e8d79058c1d65d156e |
| BLAKE2b-256 | 0fd5f0f37212109aed5201dea375a5512fca82fe020ca9f7e17ad0c51d3d0884 |

