
A protocol for autonomous LLM agents to navigate web applications via semantic landmarks.

Project description

elemm (Landmark Protocol)

The Universal AI-Native Backend Bridge. Turn any API into native AI tools in seconds.

elemm is a high-performance framework centered around the Model Context Protocol (MCP). It transforms standard REST endpoints into "AI Landmarks", enabling autonomous agents (like Claude or GPT) to discover and interact with your backend with zero-shot precision.


Core Strengths:

  • Plug-and-play MCP Support: Instantly compatible with Claude Desktop and Cursor.
  • Automated Discovery: Auto-detects landmarks, schemas, and dependencies.
  • AI-Native Safety: Built-in protection against hallucinations and security exposures.

Quick Start

1. Installation

pip install elemm

2. Implementation

from elemm import FastAPIProtocolManager
from fastapi import FastAPI, Depends
from fastapi.security import HTTPBearer
from pydantic import BaseModel

app = FastAPI()
ai = FastAPIProtocolManager(agent_welcome="Welcome to the Support-OS.")
auth_scheme = HTTPBearer()

class Ticket(BaseModel):
    title: str
    priority: int = 1

# Read-only navigation landmark: explains the API's structure to the agent
@ai.landmark(id="get_categories", type="navigation")
@app.get("/categories")
async def list_cats():
    return ["Tech", "Billing", "General"]

# Write landmark with an AI-native error remedy
@ai.landmark(id="create_ticket", type="write", remedy="If 400, ask for a clearer title.")
@app.post("/tickets")
async def create(ticket: Ticket, token: str = Depends(auth_scheme)):
    return {"id": "123", "status": "created"}

# Register the landmark routes and bind the protocol manager to the app
app.include_router(ai.get_router())
ai.bind_to_app(app)
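Conceptually, the `@ai.landmark` decorator records tool metadata for each route while leaving the handler itself untouched. The following is a minimal, dependency-free sketch of that registration pattern; the `LANDMARKS` registry and its field names are illustrative assumptions, not elemm's internal implementation:

```python
# Sketch of landmark-style registration: a decorator that records metadata
# in a module-level registry. NOT elemm's actual internals.
from typing import Any, Callable, Dict, List

LANDMARKS: List[Dict[str, Any]] = []  # hypothetical registry of registered actions

def landmark(id: str, type: str, **extra: Any) -> Callable:
    def wrap(fn: Callable) -> Callable:
        # store the metadata; the wrapped function is returned unchanged
        LANDMARKS.append({"id": id, "type": type, "handler": fn.__name__, **extra})
        return fn
    return wrap

@landmark(id="get_categories", type="navigation")
def list_cats():
    return ["Tech", "Billing", "General"]

@landmark(id="create_ticket", type="write", remedy="If 400, ask for a clearer title.")
def create(title: str, priority: int = 1):
    return {"id": "123", "status": "created"}

print([entry["id"] for entry in LANDMARKS])  # → ['get_categories', 'create_ticket']
```

Because the handler is returned as-is, the decorator stacks cleanly with FastAPI's own route decorators, as shown in the Quick Start above.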

3. Advanced: Multiple Auth Schemes

elemm detects any dependency that inherits from SecurityBase. You can mix and match different schemes like API Keys and OAuth2:

from fastapi.security import APIKeyHeader

api_key_scheme = APIKeyHeader(name="X-Admin-Key", description="Admin access only")

@ai.landmark(id="wipe_logs", type="write", instructions="High-security action.")
@app.delete("/admin/logs")
async def delete_logs(key: str = Depends(api_key_scheme)):
    return {"status": "logs cleared"}

elemm will automatically mark the X-Admin-Key as managed_by: protocol in the manifest.
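The detection itself can be pictured as a signature scan: any parameter whose default resolves to a `SecurityBase`-derived scheme is lifted into the manifest as protocol-managed. The sketch below mimics that logic without importing FastAPI; the class names mirror FastAPI's, but the scan function is a hypothetical stand-in:

```python
# Standalone sketch of SecurityBase auto-detection. The classes mimic
# FastAPI's security primitives; detect_security_params() is an assumption
# about how such a scan could work, not elemm's actual code.
import inspect
from typing import Any, Dict, List

class SecurityBase:  # stand-in for fastapi.security.base.SecurityBase
    pass

class APIKeyHeader(SecurityBase):
    def __init__(self, name: str, description: str = ""):
        self.name = name
        self.description = description

def Depends(dep: Any) -> Any:  # simplified: passes the scheme through
    return dep

api_key_scheme = APIKeyHeader(name="X-Admin-Key", description="Admin access only")

def detect_security_params(fn) -> List[Dict[str, Any]]:
    """Return manifest entries for every SecurityBase-derived dependency."""
    params = []
    for p in inspect.signature(fn).parameters.values():
        if isinstance(p.default, SecurityBase):
            params.append({"name": p.default.name, "type": "string",
                           "required": True, "managed_by": "protocol"})
    return params

async def delete_logs(key: str = Depends(api_key_scheme)):
    return {"status": "logs cleared"}

print(detect_security_params(delete_logs))
```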


Why elemm? (OpenAPI vs. Landmark Manifest)

A standard openapi.json is built for humans and documentation tooling. It is full of HTTP noise that confuses LLMs. elemm instead produces a "Hardened Manifest" optimized for action.

Feature        | Standard OpenAPI                           | elemm Landmark
Noise Level    | High (responses, content types, etc.)      | Low (action-first)
Tool Calling   | Complex paths & methods                    | Unique action IDs
Security       | AI must handle tokens (risky, noisy)       | Automated security detection (native support for HTTPBearer, OAuth2, APIKey)
Managed Auth   | Manual configuration needed                | Zero-config: auto-detects SecurityBase
Error Handling | Generic 4xx/5xx                            | Functional remedy instructions
Context        | Exposes internal fields (Request/Session)  | Clean context isolation

Comparison Example: create_ticket

Standard openapi.json (Noisy)

"/tickets": {
  "post": {
    "summary": "Create",
    "operationId": "create_tickets_post",
    "parameters": [{ "name": "auth", "in": "header", "required": true, "schema": { "type": "string" } }],
    "requestBody": { "content": { "application/json": { "schema": { "$ref": "#/components/schemas/Ticket" } } } },
    "responses": { "200": { "description": "Successful Response", "content": { "application/json": {} } } }
  }
}

llm-landmarks.json (Optimized)

{
  "id": "create_ticket",
  "type": "write",
  "description": "No description provided.",
  "remedy": "If 400, ask for a clearer title.",
  "method": "POST",
  "url": "/tickets",
  "parameters": [
    { "name": "auth", "type": "string", "required": true, "managed_by": "protocol" }
  ],
  "payload": [
    { "name": "title", "type": "string", "required": true },
    { "name": "priority", "type": "integer", "required": false, "default": 1 }
  ]
}
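The transformation from the noisy form to the lean form can be sketched as a simple projection: keep only the action-relevant fields, fold header-based auth parameters into `managed_by: protocol`, and drop the response/content-type machinery entirely. The `harden()` helper below is an illustrative assumption, not elemm's code:

```python
# Sketch: collapsing a noisy OpenAPI operation into a lean landmark entry.
# Field names follow the llm-landmarks.json example above; the helper and
# its filtering rules are assumptions.
def harden(path: str, method: str, op: dict, meta: dict) -> dict:
    """Strip response/content-type noise; keep only action-relevant fields."""
    entry = {
        "id": meta["id"],
        "type": meta["type"],
        "description": op.get("description") or "No description provided.",
        "method": method.upper(),
        "url": path,
        "parameters": [
            {"name": p["name"], "type": p["schema"]["type"],
             "required": p.get("required", False),
             # header-based auth is taken over by the protocol layer
             **({"managed_by": "protocol"} if p["in"] == "header" else {})}
            for p in op.get("parameters", [])
        ],
    }
    if "remedy" in meta:
        entry["remedy"] = meta["remedy"]
    return entry  # note: op["responses"] is deliberately discarded

openapi_op = {
    "summary": "Create",
    "operationId": "create_tickets_post",
    "parameters": [{"name": "auth", "in": "header", "required": True,
                    "schema": {"type": "string"}}],
    "responses": {"200": {"description": "Successful Response"}},
}
lean = harden("/tickets", "post", openapi_op,
              {"id": "create_ticket", "type": "write",
               "remedy": "If 400, ask for a clearer title."})
print(lean["id"], lean["method"], "responses" in lean)
```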

The @landmark Decorator Reference

The decorator is the primary way to provide semantic context to an AI agent. Here is a detailed breakdown of all available options:

id (string, required)

The unique identifier for the action. The AI uses this ID to call the tool.

  • Example: id="process_payment"

type (string, required)

Defines the nature of the action. This helps the AI categorize its capabilities:

  • read: For fetching information (e.g., searching products, reading logs).
  • write: For actions that change state (e.g., creating a user, deleting a file).
  • navigation: For actions that explain the API's structure (e.g., listing categories, help endpoints).
  • Example: type="write"

description (string, optional)

The semantic instruction for the AI. If omitted, the function's docstring is used. This is the most important field for AI reasoning.

  • Example: description="Use this to find products by price range or keywords."

instructions (string, optional)

Specific "Rules of Engagement" for this action. Useful for enforcing business logic at the AI level.

  • Example: instructions="Always ask for the user's shipping address BEFORE calling this."

remedy (string, optional)

AI-Native Error Handling. Instructions for the agent on how to proceed if the API call fails or returns an error.

  • Example: remedy="If this returns a 402 (Payment Required), explain the premium subscription benefits to the user."

hidden (boolean, default: False)

If set to True, the landmark is registered in your code but excluded from the AI manifest.
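The constraints above can be summarized in a small validator. `validate_landmark()` is a hypothetical helper written for this sketch; elemm performs its own validation internally:

```python
# Sketch of the decorator's option contract: id and type are required,
# type is constrained, hidden defaults to False. validate_landmark() is
# a hypothetical helper, not part of elemm's API.
REQUIRED = {"id", "type"}
ALLOWED_TYPES = {"read", "write", "navigation"}

def validate_landmark(opts: dict) -> dict:
    missing = REQUIRED - opts.keys()
    if missing:
        raise ValueError(f"missing required options: {missing}")
    if opts["type"] not in ALLOWED_TYPES:
        raise ValueError(f"unknown type: {opts['type']}")
    opts.setdefault("hidden", False)
    return opts

opts = validate_landmark({
    "id": "process_payment",
    "type": "write",
    "description": "Charge the customer's saved payment method.",
    "instructions": "Always confirm the amount with the user BEFORE calling this.",
    "remedy": "If this returns a 402, explain the premium subscription benefits.",
})
print(opts["hidden"])  # → False (defaults when omitted)
```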


Native MCP Support

elemm natively supports the Model Context Protocol (MCP). This allows you to export your entire API as a toolkit for high-end AI clients in seconds:

  • Instant Integration: Works with Claude Desktop, Cursor, and other MCP-compatible agents.
  • Auto-Sync: Your AI tools are always in sync with your latest API deployment.
  • Bridge Logic: See examples/mcp_bridge.py for a ready-to-use bridge implementation.

Claude Desktop Integration

To connect your Landmarks to Claude, add the following to your claude_desktop_config.json:

{
  "mcpServers": {
    "landmarks": {
      "command": "python3",
      "args": ["/path/to/your/repo/examples/mcp_bridge.py"],
      "env": {
        "LANDMARK_URLS": "http://localhost:8000"
      }
    }
  }
}
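The plural name `LANDMARK_URLS` suggests the bridge can target several backends at once. The sketch below assumes a comma-separated list; the actual separator accepted by examples/mcp_bridge.py may differ:

```python
# Sketch of how a bridge might parse LANDMARK_URLS. Assumes a
# comma-separated list of base URLs; check mcp_bridge.py for the
# real parsing rules.
import os

def parse_landmark_urls(raw: str) -> list[str]:
    # trim whitespace and trailing slashes so URLs can be joined cleanly
    return [u.strip().rstrip("/") for u in raw.split(",") if u.strip()]

os.environ["LANDMARK_URLS"] = "http://localhost:8000, http://localhost:9000/"
print(parse_landmark_urls(os.environ["LANDMARK_URLS"]))
```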

Practical Applications

elemm is universal. It is designed for any API that needs to be controlled or queried by an AI Assistant:

  • E-Commerce: Automate shopping flows, support, and inventory tracking.
  • Internal Dashboards & ERP: Turn complex data silos into conversational interfaces for employees.
  • IoT & Infrastructure: Control hardware, manage servers, or query sensors via natural language.
  • SaaS & Tooling: Enable your users to interact with your platform via AI agents autonomously.

Essentially, if it has an API, elemm makes it AI-Native.


Core Features and Hardening

Standard OpenAPI documentation is often too noisy for LLMs. elemm provides a hardened abstraction layer:

  • Rock-solid Security: Native support for FastAPI SecurityBase. Sensitive schemes (HTTPBearer, APIKey) are automatically detected and marked as managed_by: protocol.
  • Context Injection: Technical fields (e.g., Request, Session) and internal dependencies are automatically moved to a context scope, keeping the agent's input clean.
  • Pydantic V2 Support: Deep extraction of Enums, nested models, and field metadata.

Agent in Action

How an autonomous agent perceives and uses your elemm landmarks in a real conversation:

Agent Interaction Demo

Note the intelligence: in this example, the next action technically requires a JWT token. When the user provides plain credentials instead, the agent autonomously understands the context, performs the login itself, and then proceeds with the add_to_cart call.


License

GNU General Public License v3.0. Created by Marc Stöcker.

Project details


Download files

Download the file for your platform.

Source Distribution

elemm-0.3.11.tar.gz (27.3 kB)

Uploaded Source

Built Distribution


elemm-0.3.11-py3-none-any.whl (23.8 kB)

Uploaded Python 3

File details

Details for the file elemm-0.3.11.tar.gz.

File metadata

  • Download URL: elemm-0.3.11.tar.gz
  • Upload date:
  • Size: 27.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-0.3.11.tar.gz
Algorithm Hash digest
SHA256 f743ef2119ede518ff3262f958c4cfeab01f562b78e1c48c0da2d429719be9f9
MD5 80b2c117027be6f9198d76badc4d4e6b
BLAKE2b-256 bfc7f1467c88ec8331eb5f703274d3edcf187cc6c1d25c4f6e194c6921ae1126


Provenance

The following attestation bundles were made for elemm-0.3.11.tar.gz:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file elemm-0.3.11-py3-none-any.whl.

File metadata

  • Download URL: elemm-0.3.11-py3-none-any.whl
  • Upload date:
  • Size: 23.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for elemm-0.3.11-py3-none-any.whl
Algorithm Hash digest
SHA256 8c529505fc219f76659fb954d9e0fa6640237977abaed52c44c7eef85a375204
MD5 2770ecff6facefecc75fb14951347b82
BLAKE2b-256 c2b1806f1ac52b1f442c889a32dc36247744bf650d2441e9c3fc2cba55bdddcd


Provenance

The following attestation bundles were made for elemm-0.3.11-py3-none-any.whl:

Publisher: workflow.yml on v3rm1ll1on/elemm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
