
AI-powered natural language interface for REST APIs with OpenAPI support and real-time streaming


Enable AI

Natural language interface for REST APIs with MCP server support

Transform natural language queries into API calls using a LangGraph-powered workflow.


🎯 Overview

enable-ai is a Python library that understands natural language and automatically:

  • Matches queries to APIs - "list all users" → GET /users/
  • Authenticates automatically - Handles JWT, OAuth, API keys
  • Extracts parameters - "get user 5" → GET /users/5/
  • Returns structured data - Clean JSON responses
  • Manages state with LangGraph - Enables step re-runs and multi-turn interactions
  • Exposes MCP server - Integrate with AI assistants like Claude Desktop

Use Cases:

  • Build natural language interfaces for your APIs
  • Create AI-powered chatbots for customer support
  • Integrate with SaaS platforms for AI-driven workflows

Current scope: API-only. Database and document features are planned and documented as future extensions.


🚀 Installation & Setup

Step 1: Install the Package

pip install enable-ai

Or for development:

git clone https://github.com/EnableEngineering/enable_ai.git
cd enable_ai
pip install -e .

Step 2: Create Configuration Files

The module automatically detects config.json and .env from your working directory.

config.json - Define your data sources

{
  "data_sources": {
    "api": {
      "type": "api",
      "enabled": true,
      "base_url": "http://localhost:8002/api",
      "schema_path": "schemas/api_schema.json"
    }
  },
  "security_credentials": {
    "api": {
      "jwt": {
        "enabled": true,
        "token_endpoint": "/token/",
        "username_field": "email",
        "password_field": "password",
        "env": {
          "username": "API_EMAIL",
          "password": "API_PASSWORD"
        }
      }
    }
  }
}

.env - Store credentials securely

OPENAI_API_KEY=sk-proj-your-key-here
API_EMAIL=admin@example.com
API_PASSWORD=your_password

Important: Add .env to your .gitignore!
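For a rough sense of what .env loading involves (the library most likely relies on python-dotenv; this standalone parser is purely illustrative):

```python
# Illustrative sketch of parsing a .env file into key/value pairs.
# Not the library's actual loader.

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and '#' comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

sample = """\
# credentials for enable-ai
OPENAI_API_KEY=sk-proj-your-key-here
API_EMAIL=admin@example.com
API_PASSWORD=your_password
"""
print(parse_env(sample)["API_EMAIL"])  # admin@example.com
```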


📖 Usage Guide

1. Python Library Usage

from enable_ai import NLPProcessor

# Initialize (auto-detects config.json and .env from current directory)
processor = NLPProcessor()

# Process natural language queries
result = processor.process("list all users")

print(result['summary'])  # Natural language summary
print(result['data'])     # Structured data from API

Advanced Usage - Custom Config

# Use specific config path
processor = NLPProcessor(config_path="/path/to/config.json")

# Pass custom config dictionary
config = {
    "data_sources": {
        "api": {
            "type": "api",
            "enabled": True,
            "base_url": "http://api.example.com"
        }
    }
}
processor = NLPProcessor(config=config)

# Override authentication token
result = processor.process(
    "list all users",
    access_token="your_jwt_token_here"
)

2. MCP Server Usage

Run as a Model Context Protocol (MCP) server for AI assistants:

# Start MCP server (auto-detects config from current directory)
python3 -m enable_ai.mcp_server

Test with MCP Inspector

# Install MCP inspector
npm install -g @modelcontextprotocol/inspector

# Launch inspector
cd /path/to/your-backend
npx @modelcontextprotocol/inspector python3 -m enable_ai.mcp_server

Integrate with Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):

{
  "mcpServers": {
    "enable_ai": {
      "command": "python3",
      "args": ["-m", "enable_ai.mcp_server"],
      "cwd": "/path/to/your-backend"
    }
  }
}

Now Claude can process natural language queries against your APIs!

3. Command Line Usage

# Quick test from command line
cd /path/to/your-backend
python3 -c "
from enable_ai import NLPProcessor
proc = NLPProcessor()
result = proc.process('list all users')
print(result['summary'])
"

🏗️ Architecture

User Query: "list all users"
         ↓
    LangGraph Workflow
         ↓
    Parser (LLM-powered)
         ↓
    Intent + Parameters
         ↓
    Matcher (API only)
         ↓
    Execution Plan
         ↓
    Authentication (JWT/OAuth/API Key)
         ↓
    Execute Query
         ↓
    Results + Summary

📦 Module Components

Core Modules

orchestrator.py - Main orchestrator

The central processing engine that coordinates all operations via a LangGraph workflow. Handles query parsing, authentication, execution planning, and response summarization. This is your main entry point via NLPProcessor class.

workflow.py - LangGraph pipeline

Defines the stateful workflow for parsing, planning, executing, and summarizing API calls. Enables step re-runs and multi-turn refinement when required.
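The pipeline can be sketched as a chain of plain functions passing a shared state dict. The node names, state keys, and stubbed values below are illustrative only; the real workflow is a LangGraph graph with LLM-backed nodes.

```python
# Illustrative parse -> match -> authenticate -> execute -> summarize chain.
# Every value here is a stub; the real nodes call the LLM and the target API.

def parse(state):
    state["intent"] = {"action": "list", "resource": "users"}
    return state

def match(state):
    intent = state["intent"]
    state["plan"] = {"method": "GET", "path": f"/{intent['resource']}/"}
    return state

def authenticate(state):
    state["headers"] = {"Authorization": "Bearer <token>"}
    return state

def execute(state):
    state["data"] = [{"id": 1, "name": "Ada"}]  # stubbed API response
    return state

def summarize(state):
    state["summary"] = f"Found {len(state['data'])} users."
    return state

state = {"query": "list all users"}
for step in (parse, match, authenticate, execute, summarize):
    state = step(state)
print(state["summary"])  # Found 1 users.
```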

query_parser.py - Natural language understanding

Converts user queries into structured intents using OpenAI GPT-4. Extracts entities (IDs, names, dates), determines actions (list, get, create, update, delete), and identifies target resources.
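As an illustration of the kind of structured intent the parser produces, here is a toy rule-based stand-in. The field names (`action`, `resource`, `entities`) are assumptions, not the library's documented output format, and the real parser uses GPT-4 rather than regexes.

```python
import re

def toy_parse(query: str) -> dict:
    """Toy rule-based stand-in for the LLM parser (illustrative only)."""
    m = re.match(r"get (\w+) (\d+)$", query)
    if m:
        return {"action": "get", "resource": m.group(1) + "s",
                "entities": {"id": m.group(2)}}
    if query.startswith("list all "):
        return {"action": "list", "resource": query.removeprefix("list all "),
                "entities": {}}
    return {"action": "unknown", "resource": None, "entities": {}}

print(toy_parse("get user 5"))
# {'action': 'get', 'resource': 'users', 'entities': {'id': '5'}}
```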

types.py - Type definitions and data structures

Defines type-safe classes for requests, responses, and errors. Includes APIRequest, APIResponse, APIError, and authentication credential structures.
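A hedged sketch of what such dataclasses could look like; the class names come from the description above, but the fields are plausible guesses, not the library's exact definitions.

```python
# Hypothetical shapes for the request/response/error types described above.
from dataclasses import dataclass, field
from typing import Any, Optional

@dataclass
class APIRequest:
    method: str
    path: str
    params: dict = field(default_factory=dict)
    body: Optional[dict] = None
    headers: dict = field(default_factory=dict)

@dataclass
class APIResponse:
    status_code: int
    data: Any = None

@dataclass
class APIError(Exception):
    status_code: int
    message: str

req = APIRequest(method="GET", path="/users/5/")
print(req.method, req.path)  # GET /users/5/
```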

Data Source Matchers

api_matcher.py - REST API matching

Matches parsed queries to REST API endpoints from OpenAPI/custom schemas. Handles path parameters, query strings, request bodies, and HTTP methods (GET, POST, PUT, DELETE, PATCH).
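A minimal sketch of the matching step, assuming the internal schema shape shown in the Schema Examples section and the hypothetical intent structure used above; the real matcher's logic is considerably richer.

```python
import re

# Schema fragment in the internal format (see the Schema Examples section).
schema = {
    "users": {
        "endpoints": [
            {"path": "/users/", "method": "GET", "description": "List all users"},
            {"path": "/users/{id}/", "method": "GET", "description": "Get user by ID"},
        ]
    }
}

def match_endpoint(intent: dict) -> dict:
    """Pick the endpoint whose path parameters match the extracted entities."""
    for ep in schema[intent["resource"]]["endpoints"]:
        placeholders = re.findall(r"\{(\w+)\}", ep["path"])
        if set(placeholders) == set(intent["entities"]):
            path = ep["path"]
            for name, value in intent["entities"].items():
                path = path.replace("{" + name + "}", str(value))
            return {"method": ep["method"], "path": path}
    raise LookupError("no matching endpoint")

plan = match_endpoint({"action": "get", "resource": "users", "entities": {"id": 5}})
print(plan)  # {'method': 'GET', 'path': '/users/5/'}
```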

database_matcher.py - Database query generation (planned)

Database support is planned for a future release; the current pipeline focuses on APIs only.

knowledge_graph_matcher.py - Document/RAG search (planned)

Knowledge graph support is planned for a future release; the current pipeline focuses on APIs only.

Utilities

api_client.py - HTTP request handler

Executes REST API calls with automatic retry logic, timeout handling, and error management. Supports all HTTP methods and authentication schemes.
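A hedged sketch of retry-with-backoff logic of the kind described; the attempt count, delays, and exception handling are illustrative defaults, not the library's actual settings.

```python
import time

def with_retries(call, attempts: int = 3, base_delay: float = 0.5):
    """Run `call`, retrying on exceptions with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * 2 ** attempt)

# Usage: a flaky call that fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient")
    return {"status": 200}

print(with_retries(flaky, base_delay=0.01))  # {'status': 200}
```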

config_loader.py - Configuration management

Loads and validates configuration from JSON files or dictionaries. Handles environment variable substitution and schema path resolution.
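How the env-variable substitution might work, using the `env` mapping from the config.json example above; the function name and exact behavior are assumptions.

```python
# Illustrative sketch: names under "env" in the security_credentials block
# are looked up in the environment at load time.

def resolve_credentials(cred_cfg: dict, environ: dict) -> dict:
    """Replace env-variable names with their values from `environ`."""
    return {name: environ[var] for name, var in cred_cfg["env"].items()}

jwt_cfg = {
    "enabled": True,
    "token_endpoint": "/token/",
    "env": {"username": "API_EMAIL", "password": "API_PASSWORD"},
}
fake_env = {"API_EMAIL": "admin@example.com", "API_PASSWORD": "secret"}
print(resolve_credentials(jwt_cfg, fake_env)["username"])  # admin@example.com
```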

mcp_server.py - MCP protocol server

Exposes the NLP processor through Model Context Protocol for integration with AI assistants like Claude Desktop. Provides 4 tools: process_query, get_schema_resources, authenticate, get_config_info.

Schema Generation

schema_generator/ - Automatic schema creation

Tools to automatically generate schemas from various sources:

  • schema_converter.py - Convert OpenAPI specs to internal format (supported)
  • database_inspector.py - Introspect database schemas (planned)
  • pdf_analyzer.py - Extract structure from PDF documents (planned)
  • json_analyzer.py - Analyze JSON APIs automatically (planned)
  • cli.py - Command-line interface for schema generation

🔍 Auto-Detection

The module automatically finds configuration files from your working directory:

Priority Order

  1. Current working directory - ./config.json, ./.env (highest priority)
  2. Environment variables - $NLP_CONFIG_PATH
  3. User home directory - ~/.enable_ai/config.json
  4. Package defaults - Bundled examples
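The priority order above amounts to a first-match path search, sketched here with a hypothetical helper (not the library's actual code):

```python
import os

def first_existing(candidates):
    """Return the first candidate path that exists, else None."""
    for path in candidates:
        if path and os.path.isfile(path):
            return path
    return None

def find_config(environ=os.environ):
    """Apply the documented priority order; None means bundled defaults."""
    return first_existing([
        os.path.join(os.getcwd(), "config.json"),        # 1. working directory
        environ.get("NLP_CONFIG_PATH"),                  # 2. environment variable
        os.path.expanduser("~/.enable_ai/config.json"),  # 3. home directory
    ])
```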

Verification

# Test auto-detection
cd /path/to/your-backend
python3 << 'EOF'
import sys; sys.stderr = sys.stdout  # surface detection messages printed to stderr
from enable_ai.mcp_server import DEFAULT_CONFIG_PATH, DEFAULT_ENV_PATH
print(f"Config: {DEFAULT_CONFIG_PATH}")
print(f"Env: {DEFAULT_ENV_PATH}")
EOF

Expected output (paths reflect your project directory):

✓ Loaded .env from: /path/to/your-backend/.env
✓ Found config.json at: /path/to/your-backend/config.json
Config: /path/to/your-backend/config.json
Env: /path/to/your-backend/.env

🔐 Authentication Support

JWT (JSON Web Tokens)

Automatically obtains and refreshes JWT tokens using credentials from .env.

OAuth 2.0

Supports client credentials and authorization code flows.

API Keys

Loads API keys from environment variables and includes them in request headers.

Manual Tokens

Pass tokens explicitly: processor.process("query", access_token="token")
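A hedged sketch of how these schemes typically translate into request headers; the header names follow common HTTP conventions and may differ from what the library actually sends.

```python
# Illustrative mapping from auth scheme to request headers.
# "api_key" header name (X-API-Key) is a common convention, not confirmed.

def auth_headers(scheme: str, secret: str) -> dict:
    if scheme in ("jwt", "oauth", "manual"):
        return {"Authorization": f"Bearer {secret}"}
    if scheme == "api_key":
        return {"X-API-Key": secret}
    raise ValueError(f"unknown scheme: {scheme}")

print(auth_headers("jwt", "eyJ..."))  # {'Authorization': 'Bearer eyJ...'}
```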


🗃️ Schema Examples

API Schema (internal format, generated from OpenAPI specs)

{
  "type": "api",
  "resources": {
    "users": {
      "description": "User management endpoints",
      "endpoints": [
        {
          "path": "/users/",
          "method": "GET",
          "description": "List all users"
        },
        {
          "path": "/users/{id}/",
          "method": "GET",
          "description": "Get user by ID"
        }
      ]
    }
  }
}

Database Schema (planned)

{
  "type": "database",
  "tables": {
    "users": {
      "description": "User accounts table",
      "columns": {
        "id": {"type": "INTEGER", "primary_key": true},
        "email": {"type": "VARCHAR"},
        "name": {"type": "VARCHAR"}
      }
    }
  }
}

🧪 Testing

# Run tests
pytest tests/

# Run specific test
python tests/test_processor_query.py

# Test with real API
python tests/test_api_endpoint.py

📊 Example Queries

  • "list all users" → GET /users/ (returns user list)
  • "get user 5" → GET /users/5/ (returns user details)
  • "show me service orders with high priority" → filters service orders by priority
  • "create a new user with email test@example.com" → POST /users/ (creates user)
  • "find documents about machine learning" → semantic search in the knowledge base (planned)

🛠️ Development

Generate Schemas Automatically

# From OpenAPI spec
python -m enable_ai.schema_generator.cli \
  --source openapi \
  --input swagger.json \
  --output schemas/api_schema.json

# From database (planned)
python -m enable_ai.schema_generator.cli \
  --source database \
  --connection-string "postgresql://localhost/db" \
  --output schemas/db_schema.json

# From PDFs (planned)
python -m enable_ai.schema_generator.cli \
  --source pdf \
  --input documents/ \
  --output schemas/knowledge_graph.json
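A rough sketch of the OpenAPI-to-internal conversion that schema_converter.py performs; the grouping heuristic (first path segment as resource) and field names are assumptions for illustration.

```python
# Hypothetical OpenAPI -> internal-format conversion sketch.

def convert_openapi(spec: dict) -> dict:
    resources = {}
    for path, methods in spec.get("paths", {}).items():
        resource = path.strip("/").split("/")[0]  # e.g. "/users/{id}/" -> "users"
        for method, op in methods.items():
            resources.setdefault(resource, {"endpoints": []})["endpoints"].append({
                "path": path,
                "method": method.upper(),
                "description": op.get("summary", ""),
            })
    return {"type": "api", "resources": resources}

spec = {"paths": {"/users/": {"get": {"summary": "List all users"}},
                  "/users/{id}/": {"get": {"summary": "Get user by ID"}}}}
internal = convert_openapi(spec)
print(len(internal["resources"]["users"]["endpoints"]))  # 2
```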

🌐 Use Cases

1. Customer Support Chatbot

processor = NLPProcessor()
user_query = "Show me my recent orders"
result = processor.process(user_query, access_token=user_token)
# Returns order history automatically

2. Internal Tools (planned)

# Let employees query databases naturally (planned)
result = processor.process("How many users signed up this month?")

3. API Documentation Assistant

# Help developers discover APIs
result = processor.process("What user endpoints are available?")

4. SaaS Integration

# Deploy as MCP server for AI assistant integration
# Claude Desktop, custom agents, etc.

🤝 Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

📄 License

MIT License - See LICENSE file for details


💡 Quick Start Summary

# 1. Install
pip install enable-ai

# 2. Create config.json and .env in your project

# 3. Use it
python3 -c "
from enable_ai import NLPProcessor
proc = NLPProcessor()
print(proc.process('list all users')['summary'])
"

That's it! The module handles authentication, API matching, and execution automatically. 🚀

Project files

  • Source distribution: enable_ai-0.3.3.tar.gz (57.7 kB)
  • Built distribution: enable_ai-0.3.3-py3-none-any.whl (64.7 kB, Python 3)

Both files were uploaded via twine/6.2.0 (CPython/3.9.6), without Trusted Publishing.

File hashes

enable_ai-0.3.3.tar.gz

  • SHA256: 23e7d077caba9ef2d3f5782b63c17c431ffb00cca24ed497ce19bf67b0bc921f
  • MD5: e6e0c21ce1e3be8411358a9982230431
  • BLAKE2b-256: 6967bc4711219a26705b04fee3501e14ecbed47ae36dccc31e0b667c216a5025

enable_ai-0.3.3-py3-none-any.whl

  • SHA256: 2eb2a5041f2f54042ef97efd0ae72b08c38f452fea8e6a1e53a051c6bdab7c90
  • MD5: 954005e60c2a616e44de71494266a72b
  • BLAKE2b-256: 0343473d48790e3a5bc60d0d2f060ff43457bd6766ed8c95a8417061ca702f18
