
AI Landmarks: The specialized protocol for autonomous agent discovery.


Elemm: The Landmark Manifest Protocol


The Infrastructure for the Agentic Web.

Elemm is the Landmark Manifest Protocol, a next-generation communication framework designed to transform how autonomous LLM agents interact with the digital world. Instead of static tool definitions, Elemm provides a dynamic, manifest-driven architecture that enables agents to discover, navigate, and execute complex workflows across distributed APIs with unprecedented efficiency.


The Vision: Agentic Web

In the Agentic Web, every API is a "Landmark". Agents no longer need massive, hardcoded system prompts to understand a service. They discover capabilities on-the-fly via a standardized manifest, just like a human navigates a website.

  • Unified Discovery: Every Elemm-compliant server exposes its structure at /.well-known/elemm-manifest.md.
  • Zero System Prompt: By providing rich semantic landmarks and manifest-driven discovery, you can eliminate thousands of tokens from your system prompts. The protocol is the documentation.
  • One MCP Server, Infinite APIs: The built-in Elemm Gateway connects to any OpenAPI, GraphQL, or native Elemm service. A single pip install gives you a universal MCP server that discovers and loads landmarks on-the-fly, allowing you to scale your agent's capabilities without ever restarting your infrastructure.
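The manifest location above is a fixed well-known path, so a client can derive it from any service root. A minimal sketch of that derivation (the helper name `manifest_url` is ours, not part of the Elemm package; the actual HTTP fetch is left to your client of choice):

```python
from urllib.parse import urljoin

def manifest_url(base_url: str) -> str:
    """Return the standard Elemm manifest location for a service root.

    The /.well-known/elemm-manifest.md path is fixed by the protocol;
    only the host part varies per service.
    """
    return urljoin(base_url.rstrip("/") + "/", ".well-known/elemm-manifest.md")

print(manifest_url("https://api.example.com"))
# → https://api.example.com/.well-known/elemm-manifest.md
```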

The Philosophy: Decoupling Intelligence

Elemm is more than just a protocol; it's a shift toward Decentralized Intelligence. In the traditional SaaS model, providers often bundle their APIs with expensive, centralized LLM interfaces. Elemm decouples the "Body" (the API) from the "Brain" (the Agent).

Bring Your Own Agent (BYOA)

With Elemm, API providers only define the Landmarks and Manifests. The user brings their own autonomous agent to the platform. This shifts the computational burden and cost of "reasoning" to the edge—the user's own system.

Sustainability & Efficiency

By eliminating the need for massive, repetitive system prompts and context-heavy tool injections, Elemm significantly reduces the global token footprint of AI interactions.

  • Lower Latency: No more waiting for centralized "gatekeeper" models to process 20k tokens of documentation.
  • Reduced CO2 & Energy: Fewer tokens mean less GPU compute time, directly translating into a lower carbon footprint for every autonomous task.
  • Cost Sovereignty: Providers save on LLM hosting and token costs, while users get the freedom to choose the model that best fits their task and budget.

Core Advantages

Standard protocols like MCP often struggle with large-scale toolsets. Elemm provides a structural solution:

  • Efficient Discovery: Agents only see a high-level manifest, loading detailed tool schemas only when needed (on-demand inspection).
  • Atomic Sequencing: Execute multiple tool calls in a single LLM turn with native variable piping ($step0.id).
  • Multi-Protocol Gateway: Connect to any OpenAPI, GraphQL, or native Elemm service through a single MCP server.
  • Security Policy Engine: Built-in Guardian mode with pattern blacklists, landmark restrictions, and HTTP method filtering.
  • SmartRepair Engine: Built-in error handling that provides agents with actionable remedies instead of cryptic stack traces.
  • Token Economy: Reduces input tokens by up to 90% in complex forensic and administrative scenarios.
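The `$step0.id` piping mentioned above can be pictured as reference substitution: later steps name a field from an earlier step's result, and the gateway fills it in before execution. The sketch below illustrates those semantics in plain Python; the regex and dotted-path rules are our assumptions, not the gateway's actual resolver:

```python
import re

# Matches "$stepN.path.to.field" references (assumed grammar, for illustration).
REF = re.compile(r"^\$step(\d+)\.(.+)$")

def resolve_args(args: dict, results: list) -> dict:
    """Replace $stepN.field references in args with values from prior results."""
    resolved = {}
    for key, value in args.items():
        match = REF.match(value) if isinstance(value, str) else None
        if match:
            step, path = int(match.group(1)), match.group(2).split(".")
            node = results[step]
            for part in path:  # walk the dotted path into the step result
                node = node[part]
            resolved[key] = node
        else:
            resolved[key] = value
    return resolved

# Step 0 created a pet; step 1 refers to the returned id without a second LLM turn.
results = [{"id": 42, "name": "Rex"}]
print(resolve_args({"pet_id": "$step0.id"}, results))  # → {'pet_id': 42}
```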

Quick Start

1. Install

python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install elemm

2. Connect your AI Agent (MCP Client)

The fastest way to use Elemm is via the built-in Gateway. It acts as a universal MCP server that turns any OpenAPI or GraphQL API into a tool server.

Do not run this manually in your terminal. Instead, configure your AI agent (like Claude Desktop or Cursor) to run the elemm-gateway command.

Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "elemm-gateway": {
      "command": "/absolute/path/to/project/.venv/bin/python3",
      "args": ["-m", "elemm_gateway.cli"]
    }
  }
}

(Note: Alternatively, set "command" to elemm-gateway; use the absolute path to that executable if it is not on your system PATH.)

3. Start Discovering

Once connected, tell your agent:

"Use Elemm to connect to https://petstore.swagger.io/v2/swagger.json and list all available pets."

The Gateway provides exactly 8 core tools to the agent. All domain-specific actions are discovered on-the-fly via the Elemm protocol.

4. Build Your Own Landmark Server (Optional)

Elemm uses a decorator-based approach to turn standard Python functions into high-performance landmarks:

from elemm import AIProtocolManager, MetadataRegistry

registry = MetadataRegistry("landmarks.yaml")
manager = AIProtocolManager(registry=registry)

@manager.landmark("security:quarantine_node")
async def quarantine_node(node_id: str, urgent: bool = False):
    """Quarantines a compromised server node."""
    return {"status": "success", "node": node_id}
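The registry above is constructed from a landmarks.yaml file. A purely hypothetical fragment for the landmark in the example is shown below; the field names are illustrative assumptions, not Elemm's actual schema, so consult the package documentation for the real format:

```yaml
# Hypothetical landmarks.yaml entry (field names are illustrative only).
"security:quarantine_node":
  summary: Quarantines a compromised server node.
  tags: [security, remediation]
```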

Advanced Usage

  • Pydantic Discovery: Elemm automatically generates schemas from Pydantic models.
  • Response Hygiene: Built-in _select, _filter, and _limit parameters prevent context overflow.
  • Session Isolation: Use session_id to run parallel tasks without cross-contamination.
  • Self-Healing: The SmartRepair engine provides agents with actionable remedies when errors occur.
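The response-hygiene parameters can be pictured as post-processing on a list-shaped response: filter rows, project fields, cap the count. The sketch below mirrors the `_select`, `_filter`, and `_limit` names but implements our assumed semantics, not Elemm's actual parameter grammar:

```python
def hygiene(rows, _select=None, _filter=None, _limit=None):
    """Trim a list response: keep matching rows, project fields, cap the count.

    Illustrative semantics only; the real _select/_filter/_limit behavior
    is defined by Elemm, not by this sketch.
    """
    if _filter:
        rows = [r for r in rows if all(r.get(k) == v for k, v in _filter.items())]
    if _select:
        rows = [{k: r[k] for k in _select if k in r} for r in rows]
    if _limit is not None:
        rows = rows[:_limit]
    return rows

pets = [
    {"id": 1, "name": "Rex", "status": "available", "photo": "..."},
    {"id": 2, "name": "Moe", "status": "sold", "photo": "..."},
    {"id": 3, "name": "Ada", "status": "available", "photo": "..."},
]
print(hygiene(pets, _select=["id", "name"], _filter={"status": "available"}, _limit=1))
# → [{'id': 1, 'name': 'Rex'}]
```

Trimming responses this way keeps large list results from flooding the agent's context window, which is the overflow problem the bullet above describes.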

License

Copyright (C) 2026 Marc Stöcker. GPLv3 License. See LICENSE for details.

Project details


Download files

Download the file for your platform.

Source Distribution

elemm-1.1.4.tar.gz (75.5 kB)

Uploaded Source

Built Distribution


elemm-1.1.4-py3-none-any.whl (72.6 kB)

Uploaded Python 3

File details

Details for the file elemm-1.1.4.tar.gz.

File metadata

  • Download URL: elemm-1.1.4.tar.gz
  • Upload date:
  • Size: 75.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-1.1.4.tar.gz:

  • SHA256: b62c0f0ec6ff564b8f70857088262e53337df0e65098c93312c27270188b5f4f
  • MD5: 8938397f23d4cb16f5c55db3817427b5
  • BLAKE2b-256: d0f3096fcd8195873aac0051e9e2eeb840021992521b79116608f729ca154568


Provenance

The following attestation bundles were made for elemm-1.1.4.tar.gz:

Publisher: workflow.yml on v3rm1ll1on/elemm


File details

Details for the file elemm-1.1.4-py3-none-any.whl.

File metadata

  • Download URL: elemm-1.1.4-py3-none-any.whl
  • Upload date:
  • Size: 72.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-1.1.4-py3-none-any.whl:

  • SHA256: aa3ae40b85245f9555adf207a0e4a16d687e96ab20f365eabddafb8fa3695370
  • MD5: 12eeeb850f241172e19daa0de2c42ef7
  • BLAKE2b-256: b875bdb2719a676bd03524430db2849063ead0015c8972fc3aa8e6cf08c5d844


Provenance

The following attestation bundles were made for elemm-1.1.4-py3-none-any.whl:

Publisher: workflow.yml on v3rm1ll1on/elemm

