FlowGenX SDK

Intelligent gateway for AI agents to dynamically connect to any MCP server.

A Python SDK for connecting AI agents to FlowGenX MCP servers. Enables seamless integration with LangChain, AutoGen, CrewAI, and other AI frameworks.

Features

  • Dynamic Discovery - Auto-discover servers and tools across your tenant
  • Framework Adapters - Native support for LangChain, AutoGen, and CrewAI
  • MCP Gateway - Use with Claude Code and Cursor as an MCP server
  • Async/Sync Support - Both async and sync clients available
  • Connection Pooling - Efficient resource management

Installation

# Basic installation
pip install flowgenx-sdk

# With LangChain support (quotes stop zsh from globbing the brackets)
pip install "flowgenx-sdk[langchain]"

# With all framework adapters
pip install "flowgenx-sdk[all]"

Usage with Claude Code

Add to ~/.claude/mcp.json:

{
  "mcpServers": {
    "flowgenx": {
      "command": "uv",
      "args": ["run", "flowgenx"],
      "env": {
        "FLOWGENX_TENANT": "your-tenant",
        "FLOWGENX_ENVIRONMENT": "your-environment",
        "FLOWGENX_API_KEY": "your-api-key"
      }
    }
  }
}

To pin the gateway to a single server instead of discovery mode, also set FLOWGENX_SERVER:

{
  "mcpServers": {
    "flowgenx": {
      "command": "uv",
      "args": ["run", "flowgenx"],
      "env": {
        "FLOWGENX_TENANT": "your-tenant",
        "FLOWGENX_ENVIRONMENT": "your-environment",
        "FLOWGENX_SERVER": "your-server-id",
        "FLOWGENX_API_KEY": "your-api-key"
      }
    }
  }
}

Usage with Cursor

Add to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "flowgenx": {
      "command": "uv",
      "args": ["run", "flowgenx"],
      "env": {
        "FLOWGENX_TENANT": "your-tenant",
        "FLOWGENX_ENVIRONMENT": "your-environment",
        "FLOWGENX_API_KEY": "your-api-key"
      }
    }
  }
}

Python SDK

Environment Setup

Set your FlowGenX credentials:

export FLOWGENX_TENANT="your-tenant"
export FLOWGENX_ENVIRONMENT="your-environment"
export FLOWGENX_API_KEY="your-api-key"
export FLOWGENX_SERVER="your-server-id"  # Optional: for specific server mode

Or use a .env file (copy from .env.example).
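Loading a `.env` file is usually done with the `python-dotenv` package. For illustration only, a minimal stdlib-only loader (a hypothetical helper, not part of the SDK) might look like:

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments skipped,
    already-set environment variables take precedence."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip optional surrounding quotes from the value
            os.environ.setdefault(key.strip(), value.strip().strip('"').strip("'"))
```

Call `load_env_file()` once before constructing the gateway so the `os.environ` lookups in the examples below resolve.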

Basic Usage

import asyncio
import os
from flowgenx_sdk import FlowGenXGateway

async def main():
    async with FlowGenXGateway(
        tenant=os.environ["FLOWGENX_TENANT"],
        environment=os.environ["FLOWGENX_ENVIRONMENT"],
        server_id=os.environ["FLOWGENX_SERVER"],
        api_key=os.environ["FLOWGENX_API_KEY"]
    ) as gateway:
        # List available tools
        tools = await gateway.list_tools()
        for tool in tools:
            print(f"- {tool.name}: {tool.description}")

        # Call a tool
        result = await gateway.call_tool("tool_name", {"param": "value"})
        print(result)

asyncio.run(main())

Dynamic Discovery

import asyncio
import os
from flowgenx_sdk import FlowGenXGateway

async def main():
    gateway = FlowGenXGateway(
        tenant=os.environ["FLOWGENX_TENANT"],
        environment=os.environ["FLOWGENX_ENVIRONMENT"],
        api_key=os.environ["FLOWGENX_API_KEY"]
    )

    # Discover available servers
    servers = await gateway.discover_servers()
    for server in servers:
        print(f"Server: {server.name} (ID: {server.id})")

    # Connect to the first discovered server and list its tools
    await gateway.connect(server_id=servers[0].id)
    tools = await gateway.list_tools()
    await gateway.disconnect()

asyncio.run(main())

Framework Integration

LangChain / LangGraph

import asyncio
import os
from flowgenx_sdk import FlowGenXGateway
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    async with FlowGenXGateway(
        tenant=os.environ["FLOWGENX_TENANT"],
        environment=os.environ["FLOWGENX_ENVIRONMENT"],
        server_id=os.environ["FLOWGENX_SERVER"],
        api_key=os.environ["FLOWGENX_API_KEY"]
    ) as gateway:
        tools = gateway.as_langchain_tools()
        llm = ChatOpenAI(model="gpt-4o-mini")
        agent = create_react_agent(llm, tools)

        result = await agent.ainvoke({
            "messages": [("user", "Your query here")]
        })
        print(result)

asyncio.run(main())

AutoGen

import asyncio
import os
from flowgenx_sdk import FlowGenXGateway
from autogen import AssistantAgent, UserProxyAgent

async def main():
    async with FlowGenXGateway(
        tenant=os.environ["FLOWGENX_TENANT"],
        environment=os.environ["FLOWGENX_ENVIRONMENT"],
        server_id=os.environ["FLOWGENX_SERVER"],
        api_key=os.environ["FLOWGENX_API_KEY"]
    ) as gateway:
        tool_defs, function_map = gateway.as_autogen_tools()

        assistant = AssistantAgent("assistant", llm_config={"tools": tool_defs})
        user_proxy = UserProxyAgent("user_proxy", function_map=function_map)

asyncio.run(main())

CrewAI

import asyncio
import os
from flowgenx_sdk import FlowGenXGateway
from crewai import Agent, Task, Crew

async def main():
    async with FlowGenXGateway(
        tenant=os.environ["FLOWGENX_TENANT"],
        environment=os.environ["FLOWGENX_ENVIRONMENT"],
        server_id=os.environ["FLOWGENX_SERVER"],
        api_key=os.environ["FLOWGENX_API_KEY"]
    ) as gateway:
        tools = gateway.as_crewai_tools()

        agent = Agent(role="Assistant", goal="Help users", tools=tools)
        task = Task(description="Your task", agent=agent)
        crew = Crew(agents=[agent], tasks=[task])
        result = crew.kickoff()

asyncio.run(main())

API Reference

FlowGenXGateway

The main interface for connecting to FlowGenX MCP servers.

class FlowGenXGateway:
    def __init__(
        tenant: str,                     # Tenant identifier
        environment: str,                # Environment name
        api_key: str,                    # API key for authentication
        server_id: Optional[str] = None, # Server ID (omit for discovery mode)
    )

    # Connection
    async def connect(server_id: Optional[str] = None) -> FlowGenXGateway
    async def disconnect() -> None

    # Discovery
    async def discover_servers() -> List[MCPServerInfo]
    async def search_tools(query: str) -> List[ToolSearchResult]

    # Tools
    async def list_tools() -> List[MCPToolConfig]
    async def call_tool(name: str, arguments: Dict) -> Dict

    # Framework adapters
    def as_langchain_tools() -> List[BaseTool]
    def as_autogen_tools() -> Tuple[List[Dict], Dict]
    def as_crewai_tools() -> List[Tool]
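The adapters above have to expose the async `call_tool` coroutine as plain callables, since CrewAI and classic AutoGen invoke tools synchronously. A generic sketch of that bridging pattern follows; the stub gateway stands in for the real one, and this is not the SDK's actual implementation:

```python
import asyncio
from typing import Any, Callable, Dict

class StubGateway:
    """Stand-in for FlowGenXGateway, for illustration only."""
    async def call_tool(self, name: str, arguments: Dict[str, Any]) -> Dict[str, Any]:
        return {"tool": name, "echo": arguments}

def make_sync_tool(gateway: StubGateway, name: str) -> Callable[..., Dict[str, Any]]:
    """Wrap the async call_tool coroutine in a plain synchronous callable."""
    def tool_fn(**kwargs: Any) -> Dict[str, Any]:
        # Drive the coroutine to completion on a fresh event loop
        return asyncio.run(gateway.call_tool(name, kwargs))
    return tool_fn

search = make_sync_tool(StubGateway(), "search")
print(search(query="hello"))  # {'tool': 'search', 'echo': {'query': 'hello'}}
```

Note that `asyncio.run` cannot be called from inside an already-running event loop; a real adapter would need loop-aware scheduling for that case.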

Error Handling

from flowgenx_sdk.exceptions import (
    FlowGenXError,       # Base exception
    ConnectionError,     # Connection failures
    AuthenticationError, # Invalid API key
    ToolExecutionError,  # Tool call failures
    ServerNotFoundError, # Server not found
)

try:
    async with FlowGenXGateway(...) as gateway:
        result = await gateway.call_tool("my_tool", {"param": "value"})
except ConnectionError as e:
    print(f"Connection failed: {e}")
except AuthenticationError:
    print("Invalid API key")
except ToolExecutionError as e:
    print(f"Tool '{e.tool_name}' failed: {e}")
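Connection failures are often transient and worth retrying. A small backoff helper in that spirit, illustrative only; a stub exception keeps the snippet self-contained, whereas real code would catch the SDK's `ConnectionError` imported above:

```python
import asyncio

# Stub standing in for flowgenx_sdk.exceptions.ConnectionError so this
# sketch runs on its own; in real code, import the SDK exception instead.
class ConnectionError(Exception):
    pass

async def call_with_retry(coro_factory, attempts: int = 3, base_delay: float = 0.5):
    """Retry an async call on ConnectionError with exponential backoff."""
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            await asyncio.sleep(base_delay * 2 ** attempt)

# Demo: a call that fails twice, then succeeds.
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(asyncio.run(call_with_retry(flaky, base_delay=0.01)))  # ok
```

In practice `coro_factory` would be a lambda such as `lambda: gateway.call_tool("my_tool", {"param": "value"})`, so a fresh coroutine is created for each attempt.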

Security

  • Never commit API keys - Use environment variables or .env files
  • Rotate keys regularly - Generate new keys periodically
  • Use separate keys - Different keys for dev/staging/production

See SECURITY.md for detailed security guidelines.

Examples

See the examples/ directory for complete examples:

  • basic_usage.py - Basic connection and tool usage
  • langchain_agent.py - LangChain/LangGraph integration
  • crewai_agent.py - CrewAI integration

License

MIT License - see LICENSE for details.
