Neuronum SDK
About the Neuronum SDK
The Neuronum SDK provides everything you need to set up your favorite AI model as a self-hosted agentic backend. It includes the Neuronum Server, an open-source agent wrapper that transforms your model into an executable assistant that can be managed and called through the Neuronum Client API, and the Neuronum Tools CLI for developing and publishing MCP-compliant Tools that can be installed locally on your Neuronum Server.
Requirements
- Python >= 3.8
- CUDA-compatible GPU (for Neuronum Server)
- CUDA Toolkit (for Neuronum Server)
Getting Started with the Neuronum SDK
In this brief getting started guide, you will:
- Connect to Neuronum
- Deploy a Model with Neuronum Server
- Call your Agent / Neuronum Client API
- Create & Manage a Custom Tool
Connect to Neuronum
Installation
Create and activate a virtual environment:
python3 -m venv ~/neuronum-venv
source ~/neuronum-venv/bin/activate
Install the Neuronum SDK:
pip install neuronum==2025.12.0.dev6
Note: Always activate this virtual environment (source ~/neuronum-venv/bin/activate) before running any neuronum commands.
Create a Neuronum Cell
The Neuronum Cell is your secure identity for interacting with the Neuronum Network.
neuronum create-cell
Connect your Cell
neuronum connect-cell
Neuronum Server
Neuronum Server is an agent wrapper that transforms your model into an agentic backend server that can interact with the Neuronum Client API and installed Tools.
Setup the Server
Clone the neuronum-server repository:
git clone https://github.com/neuronumcybernetics/neuronum-server.git
cd neuronum-server
Run the setup script:
bash start_neuronum_server.sh
The setup script will:
- Create a Python virtual environment
- Install all dependencies (vLLM, PyTorch, etc.)
- Start the vLLM server in the background
- Launch the Neuronum Server
Viewing Logs
tail -f server.log
tail -f vllm_server.log
Stopping the Server
bash stop_neuronum_server.sh
What the Server Does
Once running, the server will:
- Connect to the Neuronum network using your Cell credentials
- Initialize a local SQLite database for conversation memory and knowledge storage
- Auto-discover and launch any MCP servers in the tools/ directory
- Process messages from clients via the Neuronum network
- Execute scheduled tasks defined in the tasks/ directory
Neuronum Client API
Manage and call your Agent with the Neuronum Client API using different message types:
import asyncio
from neuronum import Cell

async def main():
    async with Cell() as cell:
        # ============================================
        # Example 1: Send a prompt to your Agent
        # ============================================
        prompt_data = {
            "type": "prompt",
            "prompt": "Explain what a black hole is in one sentence"
        }
        tx_response = await cell.activate_tx(prompt_data)
        print(tx_response)

        # ============================================
        # Example 2: Call a Tool with natural language
        # ============================================
        tool_call_data = {
            "type": "call_tool",
            "tool_id": "your-tool-id",  # The tool you want to use
            "prompt": "Send an email to john@example.com with subject 'Meeting' and body 'See you at 3pm'"
        }
        tx_response = await cell.activate_tx(tool_call_data)
        print(tx_response)

        # ============================================
        # Example 3: Knowledge Management
        # ============================================
        # Add knowledge to agent's database
        add_knowledge_data = {
            "type": "add_knowledge",
            "knowledge_topic": "Company Policy",
            "knowledge_data": "Our company operates from 9 AM to 5 PM Monday through Friday."
        }
        tx_response = await cell.activate_tx(add_knowledge_data)

        # Update existing knowledge
        update_knowledge_data = {
            "type": "update_knowledge",
            "knowledge_id": "12345",  # ID from previous add
            "knowledge_data": "Updated: Company operates 8 AM to 6 PM Monday through Friday."
        }
        tx_response = await cell.activate_tx(update_knowledge_data)

        # Fetch all knowledge
        fetch_data = {"type": "fetch_all_knowledge"}
        knowledge_list = await cell.activate_tx(fetch_data)
        print(knowledge_list)

        # Delete knowledge
        delete_knowledge_data = {
            "type": "delete_knowledge",
            "knowledge_id": "12345"
        }
        tx_response = await cell.activate_tx(delete_knowledge_data)

        # ============================================
        # Example 4: Tool Management
        # ============================================
        # Get all installed tools and tasks
        get_tools_data = {"type": "get_tools"}
        tools_info = await cell.activate_tx(get_tools_data)
        print(tools_info)

        # Add a tool (requires tool to be published)
        # Use stream() instead of activate_tx() to listen for agent restart
        add_tool_data = {
            "type": "add_tool",
            "tool_id": "019ac60e-cccc-7af5-b087-f6fcf1ba1299"
        }
        await cell.stream(cell.host, add_tool_data)
        # Agent will restart and send "ping" when ready

        # Delete a tool
        delete_tool_data = {
            "type": "delete_tool",
            "tool_id": "019ac60e-cccc-7af5-b087-f6fcf1ba1299"
        }
        await cell.stream(cell.host, delete_tool_data)

        # ============================================
        # Example 5: Task Scheduling (Automated Workflows)
        # ============================================
        # Add a scheduled task
        add_task_data = {
            "type": "add_task",
            "name": "Daily Report",
            "description": "Send daily summary email",
            "tool_id": "email-tool-id",
            "function_name": "send_email",
            "input_type": "prompt",  # or "static"
            "input_data": "Send daily summary to manager@company.com",
            "schedule": "weekdays@1704067200,1704153600"  # Days@Unix timestamps
        }
        await cell.stream(cell.host, add_task_data)

        # Delete a task
        delete_task_data = {
            "type": "delete_task",
            "task_id": "task-uuid-here"
        }
        await cell.stream(cell.host, delete_task_data)

        # ============================================
        # Example 6: Agent Status & Logs
        # ============================================
        # Check if agent is running
        status_data = {"type": "get_agent_status"}
        status = await cell.activate_tx(status_data)
        print(status)  # Returns: {"json": "agent running"}

        # Download agent logs
        log_data = {"type": "download_log"}
        logs = await cell.activate_tx(log_data)
        print(logs["json"]["log"])  # Full log content

if __name__ == '__main__':
    asyncio.run(main())
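The schedule field in Example 5 uses a Days@Unix-timestamps format (e.g. "weekdays@1704067200,1704153600"). As an illustration, here is a small stdlib-only sketch of building such a string from datetime objects; build_schedule is a hypothetical helper, not part of the SDK:

```python
from datetime import datetime, timezone

def build_schedule(days: str, *run_times: datetime) -> str:
    """Build a 'Days@Unix timestamps' schedule string
    (e.g. 'weekdays@1704067200,1704153600') from datetimes."""
    stamps = ",".join(str(int(t.timestamp())) for t in run_times)
    return f"{days}@{stamps}"

# Two UTC run times on consecutive days
t1 = datetime(2024, 1, 1, tzinfo=timezone.utc)
t2 = datetime(2024, 1, 2, tzinfo=timezone.utc)
print(build_schedule("weekdays", t1, t2))  # weekdays@1704067200,1704153600
```

Using timezone-aware datetimes avoids ambiguity, since Unix timestamps are always relative to UTC.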
Neuronum Tools CLI
Neuronum Tools are MCP-compliant (Model Context Protocol) plugins that can be installed on the Neuronum Server to extend your Agent's functionality, enabling it to interact with external data sources and your system.
Initialize a Tool
neuronum init-tool
You will be prompted to enter a tool name and description (e.g., "Test Tool" and "A simple test tool"). This creates a new folder named with the format Tool Name_ToolID (e.g., Test Tool_019ac60e-cccc-7af5-b087-f6fcf1ba1299).
This folder will contain 2 files:
- tool.config - Configuration and metadata for your tool
- tool.py - Your Tool/MCP server implementation
Example tool.config:
{
  "tool_meta": {
    "tool_id": "019ac60e-cccc-7af5-b087-f6fcf1ba1299",
    "version": "1.0.0",
    "name": "Test Tool",
    "description": "A simple test tool",
    "audience": "private",
    "logo": "https://neuronum.net/static/logo_new.png"
  },
  "legals": {
    "terms": "https://url_to_your/terms",
    "privacy_policy": "https://url_to_your/privacy_policy"
  },
  "requirements": [],
  "variables": []
}
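Since tool.config is plain JSON, it can be sanity-checked before publishing. This is an illustrative stdlib-only sketch (check_tool_config is not part of the CLI); it assumes the four top-level keys shown above:

```python
import json

REQUIRED_KEYS = {"tool_meta", "legals", "requirements", "variables"}

def check_tool_config(text: str) -> dict:
    """Parse a tool.config JSON string and verify its top-level keys."""
    config = json.loads(text)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"tool.config is missing keys: {sorted(missing)}")
    return config

config = check_tool_config("""{
  "tool_meta": {"tool_id": "019ac60e-cccc-7af5-b087-f6fcf1ba1299",
                "version": "1.0.0", "name": "Test Tool",
                "description": "A simple test tool", "audience": "private",
                "logo": "https://neuronum.net/static/logo_new.png"},
  "legals": {"terms": "https://url_to_your/terms",
             "privacy_policy": "https://url_to_your/privacy_policy"},
  "requirements": [],
  "variables": []
}""")
print(config["tool_meta"]["name"])  # Test Tool
```

In practice you would read the file from the tool folder instead of an inline string.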
Example tool.py:
from mcp.server.fastmcp import FastMCP

# Create server instance
mcp = FastMCP("simple-example")

@mcp.tool()
def echo(message: str) -> str:
    """Echo back a message"""
    return f"Echo: {message}"

if __name__ == "__main__":
    mcp.run()
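FastMCP derives each tool's schema from the function's signature and docstring, so extending a tool is a matter of adding another decorated function. A hedged sketch (word_count is an illustrative example, not part of the SDK); the import guard keeps the plain function testable even where mcp is not installed:

```python
def word_count(text: str) -> str:
    """Count the whitespace-separated words in a message"""
    return f"Words: {len(text.split())}"

try:
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("simple-example")
    # mcp.tool() is a decorator factory, so it can also be applied directly:
    mcp.tool()(word_count)
except ImportError:
    # mcp not installed here; the plain function still works on its own
    pass

print(word_count("one two three"))  # Words: 3
```

In a real tool.py you would use the @mcp.tool() decorator form shown above and keep the mcp.run() entry point.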
Update a Tool
After modifying your tool.config or tool.py files, submit the updates using:
neuronum update-tool
Delete a Tool
neuronum delete-tool