Distributed toolsets for pantheon-agents, providing services via the magique message transfer server.
Work in progress
## Toolsets

- Python Interpreter
- R Interpreter
- Web toolset
  - Web fetch with link resolution
  - Web search using DDGS (DuckDuckGo)
- Web browse (legacy)
  - DuckDuckGo search
  - Crawl4ai
- Code search
  - File pattern search (glob)
  - Content search (grep)
  - Directory listing (ls)
- Notebook toolset (Jupyter)
  - Read, edit, create notebooks
  - Cell management (add, delete, move)
  - Professional templates
- ScraperAPI
  - Google search
  - Web crawl
- Shell
- Convert toolset to MCP (Model Context Protocol)
- File editor / filesystem access
- File transfer
- RAG system
- Todo/task management (Claude Code style)
- Code search (glob, grep, ls)
- Jupyter notebook editing
- LaTeX compiler
- Browser-use
## Installation

```bash
git clone https://github.com/aristoteleo/magique-ai.git
cd magique-ai
pip install -e ".[dev]"
```
### Additional Dependencies

For enhanced web functionality:

```bash
pip install ddgs            # For reliable web search
pip install beautifulsoup4  # For better HTML parsing
```

For notebook functionality:

```bash
pip install nbformat  # For Jupyter notebook handling
```
All toolsets use Rich for beautiful console output, which is included in the base installation.
## Usage

Built-in toolsets:
| Toolset | Package path | Description |
|---|---|---|
| Python Interpreter | `pantheon.toolsets.python` | Run Python code in an interpreter. |
| R Interpreter | `pantheon.toolsets.r` | Run R code in an interpreter. |
| Shell | `pantheon.toolsets.shell` | Run shell commands. |
| Web Toolset | `pantheon.toolsets.web` | Web fetch and search with DDGS. |
| Domain Research | `pantheon.toolsets.domain_research` | OmicVerse-backed domain research with demo fallback. |
| Code Search | `pantheon.toolsets.code_search` | File search with glob, grep, ls. |
| Notebook Toolset | `pantheon.toolsets.notebook` | Edit and manage Jupyter notebooks (no execution). |
| File Editor | `pantheon.toolsets.file_editor` | File editing with diffs. |
| File Manager | `pantheon.toolsets.file_manager` | General filesystem access and management. |
| Web Browse (legacy) | `pantheon.toolsets.web_browse` | Legacy web search using DuckDuckGo and Crawl4ai. |
| ScraperAPI | `pantheon.toolsets.scraper` | Use ScraperAPI to perform Google search and web crawl. |
| Vector RAG | `pantheon.toolsets.vector_rag` | Query a vector-based RAG database. |
| File Transfer | `pantheon.toolsets.file_transfer` | Transfer files between different systems. |
| Todo Management | `pantheon.toolsets.todo` | Claude Code-style task management with status tracking. |
| Endpoint Hub | `pantheon.toolsets.endpoint` | Manage and coordinate multiple toolset endpoints. |
### New Toolsets

The following toolsets provide Claude Code-like functionality for an enhanced developer experience:
#### Web Toolset (`pantheon.toolsets.web`)

- `web_fetch`: Fetch web content with intelligent link resolution and markdown conversion
- `web_search`: Search using the DDGS (DuckDuckGo Search) library - reliable and fast
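The link resolution step can be illustrated with a small stdlib-only sketch (hypothetical class name; this is not the toolset's actual implementation): relative `href` values in fetched HTML are resolved against the page's URL so the returned links are always absolute.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkResolver(HTMLParser):
    """Collect <a href> links, resolved against the page's base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links become absolute; absolute links pass through
                    self.links.append(urljoin(self.base_url, value))


resolver = LinkResolver("https://example.com/docs/index.html")
resolver.feed('<a href="../about.html">About</a> <a href="https://other.org/">Other</a>')
print(resolver.links)
# ['https://example.com/about.html', 'https://other.org/']
```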
#### Code Search Toolset (`pantheon.toolsets.code_search`)

- `glob`: Find files by patterns (e.g., `*.py`, `**/*.ts`)
- `grep`: Search content across files with regex support
- `ls`: List directory contents with detailed information
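The glob and grep tools map closely onto Python's standard library. A rough stdlib sketch of what they do (not the toolset's actual code):

```python
import re
import tempfile
from pathlib import Path

# Build a small tree to search (stand-in for a real project)
root = Path(tempfile.mkdtemp())
(root / "pkg").mkdir()
(root / "pkg" / "main.py").write_text("def main():\n    return 42\n")
(root / "README.md").write_text("docs\n")

# glob: find files by pattern, recursively
matches = sorted(p.relative_to(root).as_posix() for p in root.glob("**/*.py"))
print(matches)  # ['pkg/main.py']

# grep: regex search across files, reporting file, line number, and line
hits = []
for path in root.glob("**/*.py"):
    for lineno, line in enumerate(path.read_text().splitlines(), start=1):
        if re.search(r"def \w+", line):
            hits.append((path.relative_to(root).as_posix(), lineno, line))
print(hits)  # [('pkg/main.py', 1, 'def main():')]
```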
#### Notebook Toolset (`pantheon.toolsets.notebook`)

- `read_notebook`: Display notebook contents with beautiful formatting
- `edit_notebook_cell`: Edit specific cells (code/markdown)
- `add_notebook_cell`: Add new cells at specific positions
- `create_notebook`: Create new notebooks with professional templates
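A Jupyter notebook is just JSON on disk, which is what makes programmatic editing possible. A minimal stdlib sketch of creating and editing one (the toolset builds on nbformat, which handles schema validation; the field skeleton below is a simplified assumption):

```python
import json
import os
import tempfile

# Minimal nbformat-4 skeleton; nbformat would fill in and validate these fields
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {"cell_type": "markdown", "metadata": {}, "source": ["# Analysis\n"]},
        {"cell_type": "code", "metadata": {}, "execution_count": None,
         "outputs": [], "source": ["print('hello')\n"]},
    ],
}

path = os.path.join(tempfile.mkdtemp(), "demo.ipynb")
with open(path, "w") as f:
    json.dump(notebook, f, indent=1)

# Editing a cell is just editing the JSON structure and rewriting the file
notebook["cells"][1]["source"] = ["print('edited')\n"]
print(len(notebook["cells"]))  # 2
```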
#### File Editor Toolset (`pantheon.toolsets.file_editor`)

- `read_file`: Read files with line numbers
- `edit_file`: Edit files with diff display
- `write_file`: Create new files
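The diff display that `edit_file` provides can be approximated with the standard library (a sketch of the idea, not the toolset's implementation):

```python
import difflib

before = ["x = 1\n", "print(x)\n"]
after = ["x = 2\n", "print(x)\n"]

# unified_diff yields ---/+++ headers, @@ hunk markers, and -/+ change lines
diff = list(difflib.unified_diff(before, after,
                                 fromfile="a/script.py", tofile="b/script.py"))
print("".join(diff))
```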
#### Todo Management Toolset (`pantheon.toolsets.todo`)

- `add_todo`: Add new todo items with status tracking
- `show_todos`: Display todos in Claude Code style with checkboxes
- `update_todo_status`: Change status (pending ☐, in_progress ◐, completed ☑)
- `complete_todo`: Mark todos as completed
- `start_todo`: Mark todos as in progress
- `remove_todo`: Delete todos
- `clear_completed_todos`: Remove all completed tasks
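The status model above can be sketched in a few lines (a hypothetical, minimal version of the toolset's state tracking, using the same checkbox symbols):

```python
STATUS_ICONS = {"pending": "☐", "in_progress": "◐", "completed": "☑"}

todos = []

def add_todo(text):
    todos.append({"text": text, "status": "pending"})

def update_todo_status(index, status):
    # Only the three known statuses are allowed
    assert status in STATUS_ICONS
    todos[index]["status"] = status

def show_todos():
    return "\n".join(f"{STATUS_ICONS[t['status']]} {t['text']}" for t in todos)

add_todo("write tests")
add_todo("fix parser")
update_todo_status(0, "in_progress")
print(show_todos())
# ◐ write tests
# ☐ fix parser
```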
Start a toolset, for example the Python interpreter, from the command line:

```bash
python -m pantheon.toolsets.python
```
Example usage of the new toolsets:

```bash
# Start web toolset for web fetch and search
python -m pantheon.toolsets.web

# Start code search toolset for file operations
python -m pantheon.toolsets.code_search

# Start notebook toolset for Jupyter editing
python -m pantheon.toolsets.notebook
```
See help with:

```bash
python -m pantheon.toolsets.python -- --help
```

```text
NAME
    __main__.py

SYNOPSIS
    __main__.py <flags>

FLAGS
    -s, --service_name=SERVICE_NAME
        Type: str
        Default: 'python-interpreter'
    --mcp=MCP
        Type: bool
        Default: False
    --mcp_kwargs=MCP_KWARGS
        Type: dict
        Default: {}
    -t, --toolset_kwargs=TOOLSET_KWARGS
        Type: dict
        Default: {}
```
## Development

Project structure:

- Built-in Toolsets
### Test the package

Please start a magique message transfer server first:

```bash
python -m magique.server
```

Then export the server URL and run the tests:

```bash
export MAGIQUE_SERVER_URL=ws://localhost:8765/ws
pytest -s tests/
```
### Environment configuration

First, you need Docker and buildx installed. See the Docker docs and buildx docs for installation instructions.

Magique-ai's built-in environments are stored in the environments folder, and all of them can be managed with the environment/build_images.py script:

```text
$ python environment/build_images.py -h
usage: build_images.py [-h] [-a] [-l] [-b TARGET] [--registry REGISTRY_PATH] [--push]

Docker image build automation

options:
  -h, --help            show this help message and exit
  -a, --all             Build all detected images
  -l, --list            List available Docker configurations
  -b TARGET, --build TARGET
                        Build specific image by target name
  --registry REGISTRY_PATH
                        Specify Docker registry path (e.g., ghcr.io/username)
  --push                Push the image(s) to the specified registry after building
```
### Domain Research Toolset (`pantheon.toolsets.domain_research`)

- `run_research`: Generate a concise, cited report for a query.
- Backends: `demo` (offline), `web`, `web:duckduckgo`, `web:tavily`, `web:embed[:backend]`
- Optional LLM synthesis: set `synth=True` and provide `model`, `base_url`, `api_key`
- Optional LLM scoping: `llm_scope=True` (uses an OpenAI-compatible API if available)
Start the toolset service:

```bash
python -m pantheon.toolsets.domain_research
```
Invoke via RPC (example):

```python
import asyncio

from magique.client import connect_remote


async def demo():
    s = await connect_remote("domain_research")
    resp = await s.invoke("run_research", {
        "query": "PBMC annotation best practices",
        "backend": "demo",  # or 'web', 'web:duckduckgo', 'web:embed', ...
        "fmt": "markdown",
    })
    print(resp["report"])  # Markdown or HTML

asyncio.run(demo())
```
With OmicVerse and live web + LLM synthesis (optional):

```python
resp = await s.invoke("run_research", {
    "query": "Recent advances in scRNA-seq batch correction",
    "backend": "web",  # auto-selects Tavily if TAVILY_API_KEY is set, else DuckDuckGo
    "synth": True,
    "model": "gpt-5",
    "base_url": "https://api.openai.com/v1",
    "api_key": "<OPENAI_API_KEY>",
    "llm_scope": True,
})
```
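The auto-selection of the `web` backend (Tavily when `TAVILY_API_KEY` is set, DuckDuckGo otherwise) can be sketched as follows. This is an assumption inferred from the behavior described above, not the toolset's source:

```python
import os

def resolve_web_backend(backend):
    """Map the user-facing 'web' backend to a concrete search provider."""
    if backend != "web":
        return backend  # explicit backends like 'web:duckduckgo' pass through
    if os.environ.get("TAVILY_API_KEY"):
        return "web:tavily"
    return "web:duckduckgo"

os.environ.pop("TAVILY_API_KEY", None)
print(resolve_web_backend("web"))  # web:duckduckgo

os.environ["TAVILY_API_KEY"] = "tvly-demo"
print(resolve_web_backend("web"))  # web:tavily
```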
Verification and enrichment (optional):

```bash
# Enable verification and network checks
export OV_DR_VERIFY_NETWORK=1

# Enable metadata enrichment from Crossref (optional mailto for User-Agent)
export OV_DR_ENRICH=1
export CROSSREF_MAILTO="you@example.com"  # polite usage per Crossref guidelines

# Enable PubMed enrichment (requires key)
export NCBI_API_KEY=xxxx
```
Pantheon-CLI integration: the CLI now loads this toolset by default. Launch:

```bash
python -m pantheon_cli.cli --disable_web False --disable_dr False
```

Then ask the assistant to run domain research (it can call `run_research`), for example:

> "Research PBMC annotation best practices using live web and cite sources"