
AI Landmarks: The specialized protocol for autonomous agent discovery.

Project description

Elemm: The Landmark Manifest Protocol


The Infrastructure for the Agentic Web.

Elemm is the Landmark Manifest Protocol, a next-generation communication framework designed to transform how autonomous LLM agents interact with the digital world. Instead of static tool definitions, Elemm provides a dynamic, manifest-driven architecture that enables agents to discover, navigate, and execute complex workflows across distributed APIs with unprecedented efficiency.


The Vision: Agentic Web

In the Agentic Web, every API is a "Landmark". Agents no longer need massive, hardcoded system prompts to understand a service. They discover capabilities on-the-fly via a standardized manifest, just like a human navigates a website.

  • Unified Discovery: Every Elemm-compliant server exposes its structure at /.well-known/elemm-manifest.md.
  • Zero System Prompt: By providing rich semantic landmarks and manifest-driven discovery, you can eliminate thousands of tokens from your system prompts. The protocol is the documentation.
  • One MCP Server, Infinite APIs: The built-in Elemm Gateway connects to any OpenAPI, GraphQL, or native Elemm service. A single pip install gives you a universal MCP server that discovers and loads landmarks on-the-fly, allowing you to scale your agent's capabilities without ever restarting your infrastructure.
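As a rough illustration of manifest-driven discovery, the sketch below parses landmark names out of a manifest body. The markdown layout of `SAMPLE_MANIFEST` is an assumption for demonstration only; the real grammar is defined by the Elemm protocol, and a live agent would fetch the document from `/.well-known/elemm-manifest.md` instead of using an inline string.

```python
# Illustrative sketch: extracting landmarks from a hypothetical Elemm
# manifest. The manifest layout below is an assumption, not the spec.

SAMPLE_MANIFEST = """\
# Elemm Manifest: petstore
## Landmarks
- store:list_pets -- List all pets in the store
- store:get_pet -- Fetch a single pet by id
"""

def parse_landmarks(manifest: str) -> dict:
    """Extract landmark names and descriptions from a manifest body."""
    landmarks = {}
    for line in manifest.splitlines():
        line = line.strip()
        if line.startswith("- ") and " -- " in line:
            name, _, desc = line[2:].partition(" -- ")
            landmarks[name.strip()] = desc.strip()
    return landmarks

print(parse_landmarks(SAMPLE_MANIFEST))
# {'store:list_pets': 'List all pets in the store',
#  'store:get_pet': 'Fetch a single pet by id'}
```

The point of the pattern: the agent's context only ever holds this high-level index, not full tool schemas.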

The Philosophy: Decoupling Intelligence

Elemm is more than just a protocol; it's a shift toward Decentralized Intelligence. In the traditional SaaS model, providers often bundle their APIs with expensive, centralized LLM interfaces. Elemm decouples the "Body" (the API) from the "Brain" (the Agent).

Bring Your Own Agent (BYOA)

With Elemm, API providers only define the Landmarks and Manifests. The user brings their own autonomous agent to the platform. This shifts the computational burden and cost of "reasoning" to the edge—the user's own system.

Sustainability & Efficiency

By eliminating the need for massive, repetitive system prompts and context-heavy tool injections, Elemm significantly reduces the global token footprint of AI interactions.

  • Lower Latency: No more waiting for centralized "gatekeeper" models to process 20k tokens of documentation.
  • Reduced CO2 & Energy: Fewer tokens mean less GPU compute time, directly translating into a lower carbon footprint for every autonomous task.
  • Cost Sovereignty: Providers save on LLM hosting and token costs, while users get the freedom to choose the model that best fits their task and budget.

Core Advantages

Standard protocols like MCP often struggle with large-scale toolsets. Elemm provides a structural solution:

  • Efficient Discovery: Agents only see a high-level manifest, loading detailed tool schemas only when needed (on-demand inspection).
  • Atomic Sequencing: Execute multiple tool calls in a single LLM turn with native variable piping ($step0.id).
  • Multi-Protocol Gateway: Connect to any OpenAPI, GraphQL, or native Elemm service through a single MCP server.
  • Security Policy Engine: Built-in Guardian mode with pattern blacklists, landmark restrictions, and HTTP method filtering.
  • SmartRepair Engine: Built-in error handling that provides agents with actionable remedies instead of cryptic stack traces.
  • Token Economy: Reduces input tokens by up to 90% in complex forensic and administrative scenarios.
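To make the atomic-sequencing bullet concrete, here is a simplified resolver for the `$step0.id` piping syntax mentioned above: placeholders in a later step's arguments are replaced with fields from earlier results before the call executes. This is a sketch of the idea, not Elemm's actual implementation.

```python
import re

# Simplified resolver for "$stepN.field" variable piping between
# sequenced tool calls (illustrative only, not Elemm's internals).
STEP_REF = re.compile(r"^\$step(\d+)\.(\w+)$")

def resolve_args(args: dict, results: list) -> dict:
    """Replace $stepN.field placeholders with values from earlier step results."""
    resolved = {}
    for key, value in args.items():
        match = STEP_REF.match(value) if isinstance(value, str) else None
        if match:
            index, field = int(match.group(1)), match.group(2)
            resolved[key] = results[index][field]
        else:
            resolved[key] = value
    return resolved

# Step 0 created a ticket; step 1 pipes its id into the next call.
results = [{"id": "TCK-7", "status": "open"}]
print(resolve_args({"ticket_id": "$step0.id", "urgent": True}, results))
# {'ticket_id': 'TCK-7', 'urgent': True}
```

Because resolution happens server-side, the whole chain completes in a single LLM turn instead of one round-trip per call.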

Documentation


Quick Start

1. Install

python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install elemm

2. Connect your AI Agent (MCP Client)

The fastest way to use Elemm is via the built-in Gateway. It acts as a universal MCP server that turns any OpenAPI or GraphQL API into a tool server.

Do not run this manually in your terminal. Instead, configure your AI agent (like Claude Desktop or Cursor) to run the elemm-gateway command.

Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "elemm-gateway": {
      "command": "/absolute/path/to/project/.venv/bin/python3",
      "args": ["-m", "elemm_gateway.cli"]
    }
  }
}

(Note: Use the absolute path to elemm-gateway if it is not in your system PATH).
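If the elemm-gateway console script is on your PATH, the same entry can point at it directly instead of the virtual environment's interpreter (illustrative variant of the config above):

```json
{
  "mcpServers": {
    "elemm-gateway": {
      "command": "elemm-gateway"
    }
  }
}
```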

3. Start Discovering

Once connected, tell your agent:

"Use Elemm to connect to https://petstore.swagger.io/v2/swagger.json and list all available pets."

The Gateway provides exactly 8 core tools to the agent. All domain-specific actions are discovered on-the-fly via the Elemm protocol.

4. Build Your Own Landmark Server (Optional)

Elemm uses a decorator-based approach to turn standard Python functions into high-performance landmarks:

from elemm import AIProtocolManager, MetadataRegistry

registry = MetadataRegistry("landmarks.yaml")
manager = AIProtocolManager(registry=registry)

@manager.landmark("security:quarantine_node")
async def quarantine_node(node_id: str, urgent: bool = False):
    """Quarantines a compromised server node."""
    return {"status": "success", "node": node_id}
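Conceptually, the decorator registers the coroutine under its landmark name so calls can be routed by name at runtime. The self-contained sketch below mimics that pattern for local experimentation; it is not Elemm's internals, and `dispatch` is a hypothetical helper invented for this example.

```python
import asyncio

# Minimal sketch of name-based landmark registration and dispatch.
# Mimics the decorator pattern above; NOT Elemm's actual implementation.
LANDMARKS = {}

def landmark(name):
    """Register a coroutine under a landmark name."""
    def register(func):
        LANDMARKS[name] = func
        return func
    return register

@landmark("security:quarantine_node")
async def quarantine_node(node_id: str, urgent: bool = False):
    """Quarantines a compromised server node."""
    return {"status": "success", "node": node_id}

async def dispatch(name, **kwargs):
    """Route a call to the registered landmark (hypothetical helper)."""
    return await LANDMARKS[name](**kwargs)

print(asyncio.run(dispatch("security:quarantine_node", node_id="node-7")))
# {'status': 'success', 'node': 'node-7'}
```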

Advanced Usage

  • Pydantic Discovery: Elemm automatically generates schemas from Pydantic models.
  • Response Hygiene: Built-in _select, _filter, and _limit parameters prevent context overflow.
  • Session Isolation: Use session_id to run parallel tasks without cross-contamination.
  • Self-Healing: The SmartRepair engine provides agents with actionable remedies when errors occur.
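The response-hygiene parameters can be pictured with a small sketch. The semantics below (project fields, match key/value pairs, cap rows) are assumptions inferred from the parameter names, not Elemm's documented behavior:

```python
# Illustrative sketch of _select / _filter / _limit response hygiene.
# Semantics are assumptions based on the parameter names.

def apply_hygiene(rows, _select=None, _filter=None, _limit=None):
    """Trim a list-of-dicts response before it reaches the model context."""
    if _filter:
        rows = [r for r in rows if all(r.get(k) == v for k, v in _filter.items())]
    if _select:
        rows = [{k: r[k] for k in _select if k in r} for r in rows]
    if _limit is not None:
        rows = rows[:_limit]
    return rows

pets = [
    {"id": 1, "name": "Rex", "status": "available", "tags": ["dog"]},
    {"id": 2, "name": "Milo", "status": "sold", "tags": ["cat"]},
    {"id": 3, "name": "Bo", "status": "available", "tags": ["dog"]},
]
print(apply_hygiene(pets, _select=["id", "name"],
                    _filter={"status": "available"}, _limit=2))
# [{'id': 1, 'name': 'Rex'}, {'id': 3, 'name': 'Bo'}]
```

Trimming responses at the gateway, rather than in the prompt, is what keeps large result sets from overflowing the agent's context.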

License

Copyright (C) 2026 Marc Stöcker. GPLv3 License. See LICENSE for details.

Project details


Download files


Source Distribution

elemm-1.1.3.tar.gz (73.0 kB, source)

Built Distribution


elemm-1.1.3-py3-none-any.whl (71.2 kB, Python 3 wheel)

File details

Details for the file elemm-1.1.3.tar.gz.

File metadata

  • Download URL: elemm-1.1.3.tar.gz
  • Upload date:
  • Size: 73.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-1.1.3.tar.gz
  • SHA256: e245610e075e2878b52e442709fc07b427585d22afdbbef1987596278e09a65d
  • MD5: 3cc93853e0c245f21077d589770847e4
  • BLAKE2b-256: 9e799dc6edd3b3d910aaa2c87cd95afbea50ae05ea70d6c399fa4890746e2561


Provenance

The following attestation bundles were made for elemm-1.1.3.tar.gz:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file elemm-1.1.3-py3-none-any.whl.

File metadata

  • Download URL: elemm-1.1.3-py3-none-any.whl
  • Upload date:
  • Size: 71.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-1.1.3-py3-none-any.whl
  • SHA256: 955acd88d4631d57a54c18bfea7c12f16326a7e0d9397ccec31a4a4d4d974e0b
  • MD5: ffeac218d6efb1f3f222ae5ddd6ab1ab
  • BLAKE2b-256: 8c4cfabbfce8bbcd642a9ce3b88bebc9e66039b768eefd8fc75409e3aaa25496


Provenance

The following attestation bundles were made for elemm-1.1.3-py3-none-any.whl:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
