A protocol for autonomous LLM agents to navigate web applications via semantic landmarks.

elemm (Landmark Protocol)

The Universal AI-Native Backend Bridge. Turn any API into native AI tools in seconds.

elemm is a high-performance framework centered around the Model Context Protocol (MCP). It transforms standard REST endpoints into "AI Landmarks", enabling autonomous agents (like Claude or GPT) to discover and interact with your backend with zero-shot precision.


Core Strengths:

  • Plug-and-play MCP Support: Instantly compatible with Claude Desktop and Cursor.
  • Automated Discovery: Auto-detects landmarks, schemas, and dependencies.
  • AI-Native Safety: Built-in protection against hallucinations and security exposures.

Quick Start

1. Installation

pip install elemm

2. Implementation

from elemm import FastAPIProtocolManager
from fastapi import FastAPI, Depends
from fastapi.security import HTTPBearer
from pydantic import BaseModel

app = FastAPI()
ai = FastAPIProtocolManager(agent_welcome="Welcome to the Support-OS.")
auth_scheme = HTTPBearer()

class Ticket(BaseModel):
    title: str
    priority: int = 1

@ai.landmark(id="get_categories", type="navigation")
@app.get("/categories")
async def list_cats():
    return ["Tech", "Billing", "General"]

@ai.landmark(id="create_ticket", type="write", remedy="If 400, ask for a clearer title.")
@app.post("/tickets")
async def create(ticket: Ticket, token: str = Depends(auth_scheme)):
    return {"id": "123", "status": "created"}

# Register and bind
app.include_router(ai.get_router())
ai.bind_to_app(app)
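Conceptually, registering a landmark amounts to attaching semantic metadata to a route handler and recording it in a registry that later feeds the manifest. The following is an illustrative sketch of that idea in plain Python (the `LANDMARK_REGISTRY` name and the synchronous handler are this sketch's inventions, not elemm's actual internals):

```python
# Illustrative sketch only -- NOT elemm's real implementation.
# A landmark decorator records metadata and returns the handler unchanged.
LANDMARK_REGISTRY = {}

def landmark(id, type, description=None, remedy=None):
    def decorator(func):
        meta = {
            "id": id,
            "type": type,
            # Fall back to the docstring, as the decorator reference describes.
            "description": description or (func.__doc__ or "No description provided."),
            "remedy": remedy,
        }
        LANDMARK_REGISTRY[id] = meta
        func.__landmark__ = meta  # metadata travels with the function
        return func               # the route handler itself is untouched
    return decorator

@landmark(id="get_categories", type="navigation")
def list_cats():
    """List the available ticket categories."""
    return ["Tech", "Billing", "General"]
```

The key property this sketch demonstrates is that the handler's behavior is unchanged; only out-of-band metadata is collected for the manifest.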

3. Advanced: Multiple Auth Schemes

elemm detects any dependency that inherits from SecurityBase. You can mix and match different schemes like API Keys and OAuth2:

from fastapi.security import APIKeyHeader

api_key_scheme = APIKeyHeader(name="X-Admin-Key", description="Admin access only")

@ai.landmark(id="wipe_logs", type="write", instructions="High-security action.")
@app.delete("/admin/logs")
async def delete_logs(key: str = Depends(api_key_scheme)):
    return {"status": "logs cleared"}

elemm will automatically mark the X-Admin-Key as managed_by: protocol in the manifest.
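The detection itself is ordinary signature inspection: walk a handler's parameters and flag any default that wraps a security scheme. Below is a self-contained sketch using stand-in `SecurityBase`, `APIKeyHeader`, and `Depends` classes (the real ones live in `fastapi` and `fastapi.security`); it shows the mechanism, not elemm's actual code:

```python
import inspect

# Stand-ins for fastapi.security.base.SecurityBase, APIKeyHeader, and
# fastapi.Depends -- just enough structure to demonstrate the detection.
class SecurityBase:
    pass

class APIKeyHeader(SecurityBase):
    def __init__(self, name):
        self.name = name

class Depends:
    def __init__(self, dependency):
        self.dependency = dependency

def detect_managed_params(func):
    """Build manifest parameter entries, marking security-scheme
    dependencies as managed_by: protocol."""
    params = []
    for name, p in inspect.signature(func).parameters.items():
        default = p.default
        if isinstance(default, Depends) and isinstance(default.dependency, SecurityBase):
            # The agent never sees or handles this credential directly.
            params.append({"name": default.dependency.name, "managed_by": "protocol"})
        else:
            params.append({"name": name})
    return params

api_key_scheme = APIKeyHeader(name="X-Admin-Key")

def delete_logs(key=Depends(api_key_scheme)):
    return {"status": "logs cleared"}
```

Under these assumptions, `detect_managed_params(delete_logs)` yields a single entry for `X-Admin-Key` marked `managed_by: protocol`, matching the manifest behavior described above.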


Why elemm? (OpenAPI vs. Landmark Manifest)

Standard openapi.json is built for humans and documentation. It is full of HTTP noise that confuses LLMs. elemm produces a "Hardened Manifest" optimized for action.

| Feature | Standard OpenAPI | elemm Landmark |
| --- | --- | --- |
| Noise level | High (responses, content types, etc.) | Low (action-first) |
| Tool calling | Complex paths & methods | Unique action IDs |
| Security | AI must handle tokens (risky/noisy) | Automated security detection (native support for HTTPBearer, OAuth2, APIKey) |
| Managed auth | Manual configuration needed | Zero-config: auto-detects SecurityBase |
| Error handling | Generic 4xx/5xx | Functional remedy instructions |
| Context | Exposes internal fields (Request/Session) | Clean context isolation |

Comparison Example: create_ticket

Standard openapi.json (Noisy)

"/tickets": {
  "post": {
    "summary": "Create",
    "operationId": "create_tickets_post",
    "parameters": [{ "name": "auth", "in": "header", "required": true, "schema": { "type": "string" } }],
    "requestBody": { "content": { "application/json": { "schema": { "$ref": "#/components/schemas/Ticket" } } } },
    "responses": { "200": { "description": "Successful Response", "content": { "application/json": {} } } }
  }
}

llm-landmarks.json (Optimized)

{
  "id": "create_ticket",
  "type": "write",
  "description": "No description provided.",
  "remedy": "If 400, ask for a clearer title.",
  "method": "POST",
  "url": "/tickets",
  "parameters": [
    { "name": "auth", "type": "string", "required": true, "managed_by": "protocol" }
  ],
  "payload": [
    { "name": "title", "type": "string", "required": true },
    { "name": "priority", "type": "integer", "required": false, "default": 1 }
  ]
}
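The transformation between the two shapes is mostly subtraction. As a hedged illustration (a hypothetical `openapi_to_landmark` helper, not part of elemm's API), here is how a verbose OpenAPI operation collapses into the flat entry shown above:

```python
def openapi_to_landmark(path, method, op, meta):
    """Illustrative converter: keep only what an agent needs to act;
    drop responses, content types, and operationIds."""
    return {
        "id": meta["id"],
        "type": meta["type"],
        "remedy": meta.get("remedy"),
        "method": method.upper(),
        "url": path,
        "parameters": [
            {"name": p["name"], "type": p["schema"]["type"],
             "required": p.get("required", False)}
            for p in op.get("parameters", [])
        ],
    }

# The noisy OpenAPI operation from the comparison example above.
op = {
    "summary": "Create",
    "operationId": "create_tickets_post",
    "parameters": [{"name": "auth", "in": "header", "required": True,
                    "schema": {"type": "string"}}],
    "responses": {"200": {"description": "Successful Response"}},
}
landmark = openapi_to_landmark("/tickets", "post", op,
                               {"id": "create_ticket", "type": "write",
                                "remedy": "If 400, ask for a clearer title."})
```

Note that `responses` never survives the conversion: the agent acts on the `remedy` string instead of reasoning over status-code tables.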

The @landmark Decorator Reference

The decorator is the primary way to provide semantic context to an AI agent. Here is a detailed breakdown of all available options:

id (string, required)

The unique identifier for the action. The AI uses this ID to call the tool.

  • Example: id="process_payment"

type (string, required)

Defines the nature of the action. This helps the AI categorize its capabilities:

  • read: For fetching information (e.g., searching products, reading logs).
  • write: For actions that change state (e.g., creating a user, deleting a file).
  • navigation: For actions that explain the API's structure (e.g., listing categories, help endpoints).
  • Example: type="write"

description (string, optional)

The semantic instruction for the AI. If omitted, the function's docstring is used. This is the most important field for AI reasoning.

  • Example: description="Use this to find products by price range or keywords."

instructions (string, optional)

Specific "Rules of Engagement" for this action. Useful for enforcing business logic at the AI level.

  • Example: instructions="Always ask for the user's shipping address BEFORE calling this."

remedy (string, optional)

AI-Native Error Handling. Instructions for the agent on how to proceed if the API call fails or returns an error.

  • Example: remedy="If this returns a 402 (Payment Required), explain the premium subscription benefits to the user."

hidden (boolean, default: False)

If set to True, the landmark is registered in your code but excluded from the AI manifest.
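The options above compose into a single manifest entry per landmark. The sketch below (a hypothetical `build_landmark_entry` helper, not elemm's API) pulls the documented rules together: `type` is constrained to the three categories, `description` falls back to the docstring, and `hidden` landmarks produce no entry at all:

```python
# Hypothetical helper mirroring the documented decorator semantics.
VALID_TYPES = {"read", "write", "navigation"}

def build_landmark_entry(id, type, description=None, instructions=None,
                         remedy=None, hidden=False, docstring=None):
    if type not in VALID_TYPES:
        raise ValueError(f"type must be one of {sorted(VALID_TYPES)}")
    if hidden:
        return None  # registered in code, excluded from the AI manifest
    return {
        "id": id,
        "type": type,
        "description": description or docstring or "No description provided.",
        "instructions": instructions,
        "remedy": remedy,
    }

entry = build_landmark_entry(
    id="process_payment",
    type="write",
    instructions="Always ask for the user's shipping address BEFORE calling this.",
)
```

Because neither `description` nor a docstring was supplied, the entry falls back to "No description provided." -- the same placeholder visible in the manifest example earlier.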


Native MCP Support

elemm natively supports the Model Context Protocol (MCP). This allows you to export your entire API as a tool-kit for high-end AI clients in seconds:

  • Instant Integration: Works with Claude Desktop, Cursor, and other MCP-compatible agents.
  • Auto-Sync: Your AI tools are always in sync with your latest API deployment.
  • Bridge Logic: See examples/mcp_bridge.py for a ready-to-use bridge implementation.
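At its core, an MCP bridge has to map each landmark entry onto the MCP tool shape (`name` / `description` / `inputSchema`). The following is a minimal sketch of that mapping under the manifest format shown earlier; it is an assumption about what a bridge like examples/mcp_bridge.py must do, not its actual code:

```python
def landmark_to_mcp_tool(entry):
    """Sketch: project a landmark manifest entry onto an MCP tool
    definition. The payload fields become the JSON-Schema properties."""
    properties = {}
    required = []
    for field in entry.get("payload", []):
        properties[field["name"]] = {"type": field["type"]}
        if field.get("required"):
            required.append(field["name"])
    return {
        "name": entry["id"],
        "description": entry.get("description", ""),
        "inputSchema": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

# The create_ticket entry from the manifest example above.
entry = {
    "id": "create_ticket",
    "description": "No description provided.",
    "payload": [
        {"name": "title", "type": "string", "required": True},
        {"name": "priority", "type": "integer", "required": False, "default": 1},
    ],
}
tool = landmark_to_mcp_tool(entry)
```

Because the landmark `id` becomes the tool `name`, the agent calls `create_ticket` directly rather than composing a path and method.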

Claude Desktop Integration

To connect your Landmarks to Claude, add the following to your claude_desktop_config.json:

{
  "mcpServers": {
    "landmarks": {
      "command": "python3",
      "args": ["/path/to/your/repo/examples/mcp_bridge.py"],
      "env": {
        "LANDMARK_URLS": "http://localhost:8000"
      }
    }
  }
}

Practical Applications

elemm is universal. It is designed for any API that needs to be controlled or queried by an AI Assistant:

  • E-Commerce: Automate shopping flows, support, and inventory tracking.
  • Internal Dashboards & ERP: Turn complex data silos into conversational interfaces for employees.
  • IoT & Infrastructure: Control hardware, manage servers, or query sensors via natural language.
  • SaaS & Tooling: Enable your users to interact with your platform via AI agents autonomously.

Essentially, if it has an API, elemm makes it AI-Native.


Core Features and Hardening

Standard OpenAPI documentation is often too noisy for LLMs. elemm provides a hardened abstraction layer:

  • Rock-solid Security: Native support for FastAPI SecurityBase. Sensitive schemes (HTTPBearer, APIKey) are automatically detected and marked as managed_by: protocol.
  • Context Injection: Technical fields (e.g., Request, Session) and internal dependencies are automatically moved to a context scope, keeping the agent's input clean.
  • Pydantic V2 Support: Deep extraction of Enums, nested models, and field metadata.
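To make the "deep extraction" concrete, here is a stdlib-only analogue of what such extraction yields, using `dataclasses` and `Enum` in place of Pydantic V2 (elemm itself reads Pydantic models; this sketch only demonstrates the kind of field metadata that ends up in the manifest):

```python
import dataclasses
from dataclasses import MISSING, dataclass
from enum import Enum

# Stdlib stand-in for Pydantic field extraction: names, types,
# defaults, required flags, and enum choices.
class Priority(Enum):
    LOW = 1
    HIGH = 2

@dataclass
class Ticket:
    title: str
    priority: Priority = Priority.LOW

def extract_fields(model):
    out = []
    for f in dataclasses.fields(model):
        entry = {
            "name": f.name,
            "type": f.type.__name__ if isinstance(f.type, type) else str(f.type),
        }
        if isinstance(f.type, type) and issubclass(f.type, Enum):
            # Enum members become an explicit list of choices for the agent.
            entry["choices"] = [m.name for m in f.type]
        entry["required"] = f.default is MISSING and f.default_factory is MISSING
        if f.default is not MISSING:
            entry["default"] = f.default.name if isinstance(f.default, Enum) else f.default
        out.append(entry)
    return out
```

Enumerating the choices up front is what lets the agent pick a valid `priority` without ever seeing the model's source.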

Agent in Action

How an autonomous agent perceives and uses your elemm landmarks in a real conversation:

Agent Interaction Demo

Note the intelligence: in this example, the next action technically requires a JWT token. When the user provides plain credentials instead, the agent autonomously recognizes the gap, performs the login, and then proceeds with the add_to_cart call.


License

GNU General Public License v3.0. Created by Marc Stöcker.
