🧠 jupyter-server-ai-tools
A Jupyter Server extension for discovering and aggregating callable tools from other extensions.
This project provides a structured way for extensions to declare tools using `ToolDefinition` objects, and for agents or other consumers to retrieve those tools, with optional metadata validation.
✨ Features
- ✅ Simple, declarative `ToolDefinition` API for registering callable tools
- ✅ Automatic metadata inference from Python function signature and docstring
- ✅ `find_tools()` for discovering tools from all installed Jupyter server extensions
- ✅ `run_tools()` for executing tools from structured call objects (supports sync, async, and multiple tool call formats)
- ✅ Built-in support for OpenAI, Anthropic, MCP, and Vercel tool call schemas
- ✅ Custom parser support for user-defined tool call formats
- ✅ Clean separation between tool metadata and callable execution
- ✅ Optional JSON Schema validation to enforce tool structure at definition time
📦 Install
```bash
pip install jupyter_server_ai_tools
```
To install for development:
```bash
git clone https://github.com/Abigayle-Mercer/jupyter-server-ai-tools.git
cd jupyter-server-ai-tools
pip install -e ".[lint,test]"
```
Usage
Expose tools in your own extensions:
```python
from jupyter_server_ai_tools.models import ToolDefinition


def greet(name: str):
    """Say hello to someone."""
    return f"Hello, {name}!"


def jupyter_server_extension_tools():
    return [ToolDefinition(callable=greet)]
```
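Because `find_tools()` walks the installed Jupyter server extensions, the module that defines `jupyter_server_extension_tools()` must itself be registered as a server extension. A minimal sketch, assuming the hook above lives in a hypothetical package named `my_extension`:

```python
# Hypothetical my_extension/__init__.py
def _jupyter_server_extension_points():
    # Standard Jupyter Server hook (not part of this package): registers
    # this module as a server extension so its tools can be discovered.
    return [{"module": "my_extension"}]
```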
Discover tools from all extensions:
```python
from jupyter_server_ai_tools.tool_registry import find_tools

tools = find_tools(extension_manager)
```
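To retrieve only the JSON-serializable metadata (the shape shown in the Tool Output Example below) rather than the callables, pass `return_metadata_only=True`. A minimal sketch, assuming the flag combines with the `extension_manager` argument as in the examples above and below:

```python
from jupyter_server_ai_tools.tool_registry import find_tools

# Metadata-only discovery: each entry has "name", "description",
# and "inputSchema" (see the Tool Output Example below).
schemas = find_tools(serverapp.extension_manager, return_metadata_only=True)
for schema in schemas:
    print(schema["name"], "-", schema["description"])
```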
Execute tools via structured calls:
The `run_tools()` function allows dynamic execution of tool calls using a standard format such as `"mcp"`, `"openai"`, `"anthropic"`, or `"vercel"`:
```python
from jupyter_server_ai_tools.tool_registry import run_tools

tool_calls = [
    {"name": "greet", "input": {"name": "Abigayle"}}
]

results = await run_tools(
    extension_manager=serverapp.extension_manager,
    tool_calls=tool_calls,
    parse_fn="mcp"
)
```
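The call above uses the MCP shape (`name` plus `input`). The other supported formats are selected the same way through `parse_fn`; for example, a sketch of an OpenAI-style call, assuming the library accepts the standard `message.tool_calls` entry shape (the exact structure it expects may differ):

```python
# Hypothetical OpenAI-format call: `arguments` is a JSON string in
# OpenAI's tool-call schema, handled by the built-in "openai" parser.
openai_calls = [
    {
        "type": "function",
        "function": {"name": "greet", "arguments": '{"name": "Abigayle"}'},
    }
]

results = await run_tools(
    extension_manager=serverapp.extension_manager,
    tool_calls=openai_calls,
    parse_fn="openai",
)
```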
🧪 Running Tests
pip install -e ".[test]"
pytest
🧼 Linting and Formatting
pip install -e ".[lint]"
bash .github/workflows/lint.sh
Tool Output Example
Given the `greet()` tool above, `find_tools(return_metadata_only=True)` will return:
```json
[
  {
    "name": "greet",
    "description": "Say hello to someone.",
    "inputSchema": {
      "type": "object",
      "properties": {
        "name": { "type": "string" }
      },
      "required": ["name"]
    }
  }
]
```
Impact
This system enables:
- Extension authors to register tools with minimal effort
- Agent builders to dynamically discover and bind tools (see the sketch after this list)
- Compatibility with multiple tool call formats, including OpenAI, Anthropic, MCP, and Vercel
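As an illustration of the agent-builder case above, the metadata returned by `find_tools(..., return_metadata_only=True)` can be bound to an LLM client with a small transformation. A hedged sketch, assuming Anthropic's Messages API `tools` parameter (which uses `input_schema` rather than `inputSchema`) as the target:

```python
from jupyter_server_ai_tools.tool_registry import find_tools

schemas = find_tools(serverapp.extension_manager, return_metadata_only=True)

# Rename inputSchema -> input_schema to match Anthropic's `tools` parameter.
anthropic_tools = [
    {
        "name": t["name"],
        "description": t["description"],
        "input_schema": t["inputSchema"],
    }
    for t in schemas
]

# The tool_use blocks the model returns can then be executed with
# run_tools(..., parse_fn="anthropic").
```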
🧹 Uninstall
```bash
pip uninstall jupyter_server_ai_tools
```