
A protocol for autonomous LLM agents to navigate web applications via semantic landmarks.

Project description

elemm (Landmark Protocol)

The Universal AI-Native Backend Bridge. Turn any API into native AI tools in seconds.

elemm is a high-performance framework centered around the Model Context Protocol (MCP). It transforms standard REST endpoints into "AI Landmarks", enabling autonomous agents (like Claude or GPT) to discover and interact with your backend with zero-shot precision.


Core Strengths:

  • Plug-and-play MCP Support: Instantly compatible with Claude Desktop and Cursor.
  • AI-Native Navigation (v0.4.0): Built-in context hygiene via hierarchical "Drill-Down" discovery.
  • Automated Discovery: Auto-detects landmarks, schemas, and dependencies.
  • Managed Security: Native support for HTTPBearer, OAuth2, and APIKeys.
  • Self-Healing: Robust remedy instructions for autonomous error correction.

Key Features

Hierarchical Navigation (Context Hygiene)

Stop drowning your AI in hundreds of tools. elemm automatically organizes your API into a "Drill-Down" discovery flow using FastAPI tags:

@ai.landmark(id="explore_power", type="navigation")
@app.get("/power/overview", tags=["Power"]) # 'Power' tag creates the hierarchy
async def power_overview():
    """Energy grid management. Calling this reveals energy-specific tools."""
    return {"status": "energy_grid_online"}

@ai.landmark(id="deploy_battery", type="write")
@app.post("/power/deploy", tags=["Power"]) # Only visible AFTER entering 'Power'
async def deploy(battery_id: str):
    return {"status": "deployed"}
  • Root Level: Only entry points (Signposts) like explore_power are shown.
  • Module Level: Specialized tools are selectively loaded only when the AI "enters" a module.
  • Global Access: Critical tools (like search) can be marked with global_access=True to remain visible everywhere.
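The visibility rules above can be sketched in a few lines of plain Python. This is illustrative only — the names (`LANDMARKS`, `visible_landmarks`) are not part of elemm's API, just a model of the drill-down behavior:

```python
# Hypothetical sketch of the "Drill-Down" visibility rule (not elemm internals).
LANDMARKS = [
    {"id": "explore_power", "type": "navigation", "module": "Power", "global_access": False},
    {"id": "deploy_battery", "type": "write", "module": "Power", "global_access": False},
    {"id": "search", "type": "read", "module": None, "global_access": True},
]

def visible_landmarks(current_module=None):
    """Return the tool IDs an agent sees at its current position."""
    out = []
    for lm in LANDMARKS:
        if lm["global_access"]:
            out.append(lm["id"])          # global tools: always visible
        elif current_module is None:
            if lm["type"] == "navigation":
                out.append(lm["id"])      # root level: signposts only
        elif lm["module"] == current_module:
            out.append(lm["id"])          # inside a module: its own tools
    return out

# Root level: only the signpost and the global tool.
print(visible_landmarks())                # ['explore_power', 'search']
# After "entering" Power: the module's tools appear.
print(visible_landmarks("Power"))         # ['explore_power', 'deploy_battery', 'search']
```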

Hardened Security

Standard OpenAPI is noisy. elemm auto-detects security schemes and marks them as managed_by: protocol, so the AI knows exactly which headers are required without HTTP clutter.


Quick Start

1. Installation

pip install elemm

2. Implementation

from elemm import FastAPIProtocolManager
from fastapi import FastAPI, Depends
from fastapi.security import HTTPBearer
from pydantic import BaseModel

app = FastAPI()
ai = FastAPIProtocolManager(agent_welcome="Welcome to the Support-OS.")
auth_scheme = HTTPBearer()

class Ticket(BaseModel):
    title: str
    priority: int = 1

@ai.landmark(id="get_categories", type="navigation")
@app.get("/categories")
async def list_cats():
    return ["Tech", "Billing", "General"]

@ai.landmark(id="create_ticket", type="write", remedy="If 400, ask for a clearer title.")
@app.post("/tickets")
async def create(ticket: Ticket, token: str = Depends(auth_scheme)):
    return {"id": "123", "status": "created"}

# Register and bind
app.include_router(ai.get_router())
ai.bind_to_app(app)

3. Advanced: Multiple Auth Schemes

elemm detects any dependency that inherits from SecurityBase. You can mix and match different schemes like API Keys and OAuth2:

from fastapi.security import APIKeyHeader

api_key_scheme = APIKeyHeader(name="X-Admin-Key", description="Admin access only")

@ai.landmark(id="wipe_logs", type="write", instructions="High-security action.")
@app.delete("/admin/logs")
async def delete_logs(key: str = Depends(api_key_scheme)):
    return {"status": "logs cleared"}

elemm will automatically mark the X-Admin-Key as managed_by: protocol in the manifest.
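The effect of `managed_by: protocol` can be illustrated with a small filter — a sketch, not elemm's actual implementation: parameters marked as managed are simply excluded from what the agent has to fill in.

```python
# Illustrative sketch (not elemm internals): parameters marked
# managed_by="protocol" are hidden from the agent-facing input schema.
manifest_params = [
    {"name": "X-Admin-Key", "type": "string", "required": True, "managed_by": "protocol"},
    {"name": "dry_run", "type": "boolean", "required": False},
]

def agent_facing(params):
    """Keep only the parameters the agent must supply itself."""
    return [p["name"] for p in params if p.get("managed_by") != "protocol"]

print(agent_facing(manifest_params))   # ['dry_run']
```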


Why elemm? (Modern AI-Tooling Comparison)

Standard openapi.json is built for humans. Native MCP is great for simple scripts. elemm is built for complex, autonomous enterprise agents.

| Feature        | Standard OpenAPI     | Native MCP (Flat) | elemm (Landmarks)        |
| -------------- | -------------------- | ----------------- | ------------------------ |
| Noise Level    | High (HTTP metadata) | Medium (flat list) | Low (context-isolated)  |
| Discovery      | Static (manual)      | Static (full load) | Dynamic (hierarchical)  |
| Error Handling | Generic 4xx/5xx      | Raw error         | Autonomous remedy        |
| Managed Auth   | Manual               | Manual            | Zero-config (auto-shield) |
| Max Scale      | Limited by context   | ~50 tools         | Unlimited (scales out)   |
| Token Cost     | factor ~5x           | factor ~1x        | factor < 0.1x (clean)    |

Comparison Example: create_ticket

Standard openapi.json (Noisy)

"/tickets": {
  "post": {
    "summary": "Create",
    "operationId": "create_tickets_post",
    "parameters": [{ "name": "auth", "in": "header", "required": true, "schema": { "type": "string" } }],
    "requestBody": { "content": { "application/json": { "schema": { "$ref": "#/components/schemas/Ticket" } } } },
    "responses": { "200": { "description": "Successful Response", "content": { "application/json": {} } } }
  }
}

llm-landmarks.json (Optimized)

{
  "id": "create_ticket",
  "type": "write",
  "description": "No description provided.",
  "remedy": "If 400, ask for a clearer title.",
  "method": "POST",
  "url": "/tickets",
  "parameters": [
    { "name": "auth", "type": "string", "required": true, "managed_by": "protocol" }
  ],
  "payload": [
    { "name": "title", "type": "string", "required": true },
    { "name": "priority", "type": "integer", "required": false, "default": 1 }
  ]
}
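To see why the optimized manifest is enough, here is a hedged sketch of how an agent could assemble the actual HTTP request from the entry above — filling only the non-managed fields and leaving `auth` to the protocol runtime. `build_request` and the base URL are hypothetical, not elemm's API:

```python
# Illustrative sketch: turning the landmark entry into a concrete request.
landmark = {
    "id": "create_ticket",
    "method": "POST",
    "url": "/tickets",
    "parameters": [{"name": "auth", "type": "string", "required": True,
                    "managed_by": "protocol"}],
    "payload": [{"name": "title", "type": "string", "required": True},
                {"name": "priority", "type": "integer", "required": False, "default": 1}],
}

def build_request(landmark, agent_args, base_url="http://localhost:8000"):
    """Assemble method/url/body; protocol-managed headers are left to the runtime."""
    body = {}
    for field in landmark["payload"]:
        if field["name"] in agent_args:
            body[field["name"]] = agent_args[field["name"]]
        elif "default" in field:
            body[field["name"]] = field["default"]   # fall back to the declared default
        elif field["required"]:
            raise ValueError(f"missing required field: {field['name']}")
    return landmark["method"], base_url + landmark["url"], body

print(build_request(landmark, {"title": "Printer on fire"}))
# ('POST', 'http://localhost:8000/tickets', {'title': 'Printer on fire', 'priority': 1})
```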

The @landmark Decorator Reference

The decorator is the primary way to provide semantic context to an AI agent. Here is a detailed breakdown of all available options:

id (string, required)

The unique identifier for the action. The AI uses this ID to call the tool.

  • Example: id="process_payment"

type (string, required)

Defines the nature of the action. This helps the AI categorize its capabilities:

  • read: For fetching information (e.g., searching products, reading logs).
  • write: For actions that change state (e.g., creating a user, deleting a file).
  • navigation: For actions that explain the API's structure (e.g., listing categories, help endpoints).
  • Example: type="write"

description (string, optional)

The semantic instruction for the AI. If omitted, the function's docstring is used. This is the most important field for AI reasoning.

  • Example: description="Use this to find products by price range or keywords."

instructions (string, optional)

Specific "Rules of Engagement" for this action. Useful for enforcing business logic at the AI level.

  • Example: instructions="Always ask for the user's shipping address BEFORE calling this."

remedy (string, optional)

AI-Native Error Handling. Instructions for the agent on how to proceed if the API call fails or returns an error.

  • Example: remedy="If this returns a 402 (Payment Required), explain the premium subscription benefits to the user."
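The self-healing loop that `remedy` enables can be sketched as follows. `call_api` is a hypothetical stub standing in for the real backend; the point is only that a failed call surfaces guidance instead of a raw error:

```python
# Hedged sketch of remedy-driven error handling (call_api is a stub).
landmark = {"id": "create_ticket",
            "remedy": "If 400, ask for a clearer title."}

def call_api(args):
    # Stub backend: reject empty titles with a 400, as a real API might.
    if not args.get("title"):
        return 400, {"detail": "title must not be empty"}
    return 200, {"id": "123", "status": "created"}

def invoke(landmark, args):
    status, body = call_api(args)
    if status >= 400:
        # The agent receives actionable guidance, not just the raw error.
        return {"error": body, "remedy": landmark["remedy"]}
    return body

print(invoke(landmark, {"title": ""}))
print(invoke(landmark, {"title": "VPN down"}))
```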

hidden (boolean, default: False)

If set to True, the landmark is registered in your code but excluded from the AI manifest.


Native MCP Support

elemm natively supports the Model Context Protocol (MCP). This allows you to export your entire API as a tool-kit for high-end AI clients in seconds:

  • Instant Integration: Works with Claude Desktop, Cursor, and other MCP-compatible agents.
  • Auto-Sync: Your AI tools are always in sync with your latest API deployment.
  • Bridge Logic: See examples/mcp_bridge.py for a ready-to-use bridge implementation.
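The core of the bridge mapping can be sketched like this — a simplified, hypothetical version of what `examples/mcp_bridge.py` does: each landmark manifest entry becomes one MCP-style tool definition.

```python
# Hypothetical sketch of the landmark -> MCP tool mapping (simplified).
def landmark_to_mcp_tool(lm):
    """Convert a landmark manifest entry into an MCP-style tool schema."""
    properties, required = {}, []
    for field in lm.get("payload", []):
        properties[field["name"]] = {"type": field["type"]}
        if field.get("required"):
            required.append(field["name"])
    return {
        "name": lm["id"],
        "description": lm.get("description", ""),
        "inputSchema": {"type": "object",
                        "properties": properties,
                        "required": required},
    }

tool = landmark_to_mcp_tool({
    "id": "create_ticket",
    "description": "Create a support ticket.",
    "payload": [{"name": "title", "type": "string", "required": True},
                {"name": "priority", "type": "integer", "required": False}],
})
print(tool["name"], sorted(tool["inputSchema"]["properties"]))
# create_ticket ['priority', 'title']
```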

Claude Desktop Integration

To connect your Landmarks to Claude, add the following to your claude_desktop_config.json:

{
  "mcpServers": {
    "landmarks": {
      "command": "python3",
      "args": ["/pfad/zu/deinem/repo/examples/mcp_bridge.py"],
      "env": {
        "LANDMARK_URLS": "http://localhost:8000"
      }
    }
  }
}

Practical Applications

elemm is universal. It is designed for any API that needs to be controlled or queried by an AI Assistant:

  • E-Commerce: Automate shopping flows, support, and inventory tracking.
  • Internal Dashboards & ERP: Turn complex data silos into conversational interfaces for employees.
  • IoT & Infrastructure: Control hardware, manage servers, or query sensors via natural language.
  • SaaS & Tooling: Enable your users to interact with your platform via AI agents autonomously.

Essentially, if it has an API, elemm makes it AI-Native.


Core Features and Hardening

Standard OpenAPI documentation is often too noisy for LLMs. elemm provides a hardened abstraction layer:

  • Rocksolid Security: Native support for FastAPI SecurityBase. Sensitive schemes (HTTPBearer, APIKey) are automatically detected and marked as managed_by: protocol.
  • Context Injection: Technical fields (e.g., Request, Session) and internal dependencies are automatically moved to a context scope, keeping the agent's input clean.
  • Pydantic V2 Support: Deep extraction of Enums, nested models, and field metadata.
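The context-injection idea can be sketched in plain Python — an illustrative model, not elemm's internals: arguments with framework-injected types go to a context scope, everything else stays in the agent-facing schema.

```python
# Illustrative sketch (not elemm internals) of context-scope splitting.
TECHNICAL_TYPES = {"Request", "Session", "BackgroundTasks"}

def split_signature(params):
    """params: list of (name, type_name) pairs from an endpoint signature."""
    agent, context = [], []
    for name, type_name in params:
        (context if type_name in TECHNICAL_TYPES else agent).append(name)
    return agent, context

print(split_signature([("ticket", "Ticket"),
                       ("request", "Request"),
                       ("db", "Session")]))
# (['ticket'], ['request', 'db'])
```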

Agent in Action

How an autonomous agent perceives and uses your elemm landmarks in a real conversation:

Agent Interaction Demo

Note the intelligence: in this example, the agent technically requires a JWT token for the next action. When the user provides plain credentials instead, the agent autonomously understands the context, performs the login, and then proceeds with the add_to_cart call.


License

GNU General Public License v3.0. Created by Marc Stöcker.

Project details


Download files

Download the file for your platform.

Source Distribution

elemm-0.4.0.tar.gz (30.0 kB)

Uploaded Source

Built Distribution


elemm-0.4.0-py3-none-any.whl (25.3 kB)

Uploaded Python 3

File details

Details for the file elemm-0.4.0.tar.gz.

File metadata

  • Download URL: elemm-0.4.0.tar.gz
  • Size: 30.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-0.4.0.tar.gz

| Algorithm   | Hash digest |
| ----------- | ----------- |
| SHA256      | a29da4ba458c340a951f2a00285c2fe5ec883f31aec3531ae69f1115cc4a240d |
| MD5         | 6d56321958cda11474efbbdf59f720e8 |
| BLAKE2b-256 | 4455c619ab872ea732d61dc170a97728735e3a81e53cb93e6a643123854000fe |


Provenance

The following attestation bundles were made for elemm-0.4.0.tar.gz:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file elemm-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: elemm-0.4.0-py3-none-any.whl
  • Size: 25.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-0.4.0-py3-none-any.whl

| Algorithm   | Hash digest |
| ----------- | ----------- |
| SHA256      | b4521c0b3cc14c8b9f113af896c30b5e61da22421a90aae283d1b48d568546fb |
| MD5         | 204b02fbc2bab035f95af5efbb1a3fa4 |
| BLAKE2b-256 | 4215d397cee40d33e90a6490a31944d0c94aa89d01401fd2ca3bf2864c0f8341 |


Provenance

The following attestation bundles were made for elemm-0.4.0-py3-none-any.whl:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
