A Python toolsets library

toolsets

A Python library for aggregating multiple MCP (Model Context Protocol) servers into a single unified MCP server. Toolsets acts as a pass-through server that combines tools from multiple sources and provides semantic search capabilities for deferred tool loading.

Features

  • MCP Server Aggregation: Combine tools from multiple Gradio Spaces and MCP servers, and expose all aggregated tools through a single MCP endpoint (optional; enabled with mcp_server=True).
  • Free hosting on Hugging Face Spaces: A Toolset is itself a Gradio application (with a built-in UI for testing and exploring the available tools), so you can host it for free on Hugging Face Spaces.
  • Deferred Tool Loading: Use semantic search to discover and load tools on demand, similar to Claude's Advanced Tool Usage but for any LLM. This is useful when you have hundreds of tools or more, since it conserves your model's context window.
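The aggregation idea can be sketched in plain Python. This is a conceptual toy, not the library's code: each "server" exposes a mapping of tool names to callables, and the aggregator merges them behind one dispatch interface.

```python
# Toy illustration of MCP-style aggregation (not the toolsets implementation):
# each "server" exposes a mapping of tool name -> callable; the aggregator
# merges them and dispatches every call through a single interface.
class ToyServer:
    def __init__(self, name, tools):
        self.name = name
        self.tools = tools  # dict[str, callable]

class ToyAggregator:
    def __init__(self):
        self.registry = {}

    def add(self, server):
        for tool_name, fn in server.tools.items():
            # Namespace tools by server name to avoid collisions.
            self.registry[f"{server.name}/{tool_name}"] = fn

    def list_tools(self):
        return sorted(self.registry)

    def call(self, qualified_name, *args, **kwargs):
        return self.registry[qualified_name](*args, **kwargs)

agg = ToyAggregator()
agg.add(ToyServer("letters", {"count": lambda s, c: s.count(c)}))
agg.add(ToyServer("text", {"upper": str.upper}))
print(agg.list_tools())
print(agg.call("letters/count", "strawberry", "r"))
```

A real MCP aggregator additionally forwards tool schemas and speaks the MCP wire protocol, but the routing idea is the same.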

Example Toolset

Check out a live example: https://huggingface.co/spaces/abidlabs/podcasting-toolset


Create Your Own Toolset

Installation

pip install toolsets

For deferred tool loading with semantic search:

pip install toolsets[deferred]

Examples

Basic Usage

from toolsets import Server, Toolset

# Create a toolset
t = Toolset("My Tools")

# Add tools from MCP servers on Spaces or arbitrary URLs 
t.add(Server("gradio/mcp_tools"))
t.add(Server("username/space-name"))

# Launch UI at http://localhost:7860
# MCP server available at http://localhost:7860/gradio_api/mcp (when mcp_server=True)
t.launch(mcp_server=True)

Deferred Tool Loading

from toolsets import Server, Toolset

t = Toolset("My Tools")

# Add tools with deferred loading (enables semantic search)
t.add(Server("gradio/mcp_tools"), defer_loading=True)

# Regular tools are immediately available
t.add(Server("gradio/mcp_letter_counter_app"))

# Launch with MCP server enabled
t.launch(mcp_server=True)

When tools are added with defer_loading=True:

  • Tools are not exposed in the base tools list
  • Two special MCP tools are added: "Search Deferred Tools" and "Call Deferred Tool"
  • A search interface is available in the Gradio UI for finding deferred tools
  • Tools can be discovered using semantic search based on natural language queries
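Conceptually, deferred-tool search is embedding-based retrieval: each tool's description is embedded once, the query is embedded at call time, and tools are ranked by similarity. A toy sketch with hand-rolled bag-of-words vectors (an illustration only; the library itself uses sentence-transformers embeddings):

```python
from collections import Counter
from math import sqrt

def embed(text):
    # Toy "embedding": a bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search_tools(query, tools, top_k=2):
    # Rank deferred tools by similarity between the query and each description.
    q = embed(query)
    ranked = sorted(tools, key=lambda name: cosine(q, embed(tools[name])), reverse=True)
    return ranked[:top_k]

tools = {
    "transcribe_audio": "transcribe speech audio to text",
    "letter_counter": "count letters in a word",
    "summarize": "summarize long text into a short summary",
}
print(search_tools("turn speech into text", tools, top_k=1))
```

With dense sentence embeddings instead of word counts, the same ranking works for paraphrased queries that share no words with the tool description.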

MCP Server Configuration

By default, launch() only starts the Gradio UI without the MCP server. To enable the MCP server endpoint, pass mcp_server=True:

from toolsets import Server, Toolset

t = Toolset("My Tools")
t.add(Server("gradio/mcp_tools"))

# Launch UI only (no MCP server)
t.launch()

# Launch UI with MCP server at http://localhost:7860/gradio_api/mcp
t.launch(mcp_server=True)

When mcp_server=True, the MCP server is available at /gradio_api/mcp and a configuration tab is shown in the UI with the connection details.
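How you point an MCP client at this endpoint depends on the client. Purely as an illustration (assumption: a client that accepts URL-based server entries in a JSON config), the entry might look like:

```json
{
  "mcpServers": {
    "my-toolset": {
      "url": "http://localhost:7860/gradio_api/mcp"
    }
  }
}
```

Consult your client's documentation for its exact configuration format and supported transports.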

Custom Embedding Model

from toolsets import Server, Toolset

# Use a different sentence-transformers model
t = Toolset("My Tools", embedding_model="all-mpnet-base-v2")
t.add(Server("gradio/mcp_tools"), defer_loading=True)
t.launch(mcp_server=True)

Deploying to Hugging Face Spaces

To deploy your toolset to Hugging Face Spaces:

  1. Go to https://huggingface.co/new-space
  2. Select the Gradio SDK
  3. Create your app file (e.g., app.py) containing your toolset code
  4. Add a requirements.txt file listing toolsets (or toolsets[deferred] if you use semantic search)
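Putting the steps together, a minimal Space might contain just two files. This is a sketch assembled from the examples above, not a verified deployment:

```python
# app.py
from toolsets import Server, Toolset

t = Toolset("My Tools")
t.add(Server("gradio/mcp_tools"), defer_loading=True)
t.launch(mcp_server=True)
```

with a requirements.txt containing a single line:

```
toolsets[deferred]
```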

Your toolset will be available as both a Gradio UI and an MCP server endpoint.

Roadmap

Upcoming features and improvements:

  • Hugging Face Token Support: Automatic token passing in headers for private and ZeroGPU spaces
  • Hugging Face Data Types Integration:
    • Datasets: Add Hugging Face datasets for easy RAG on documentation and structured data
    • Models: Support for models with inference provider usage (e.g., Inference API, Inference Endpoints)
    • Papers: Search and query capabilities for Hugging Face Papers
  • Enhanced Error Handling: Better retry logic, connection pooling, and graceful degradation
  • Tool Caching: Cache tool definitions and embeddings to reduce API calls and improve startup time

Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

License

MIT License

