OrcaKit SDK

A public SDK package for AI Agent development based on LangGraph, providing common utilities, MCP adapters, and compatible LLM service support.

Features

  • 🔧 MCP Adapter: Integrated Model Context Protocol (MCP) client with multi-server configuration support
  • 🤖 Compatible Models: Support for OpenAI-compatible LLM services (e.g., DeepSeek)
  • 🛠️ Utility Functions: Message handling, model loading, and other common utilities
  • 📦 Lightweight: Minimal core dependencies for easy integration

Installation

pip install orcakit-sdk

Or using uv:

uv add orcakit-sdk

Quick Start

Using OpenAI-Compatible Models

from orcakit_sdk import create_compatible_openai_client

# Create client (requires OPENAI_API_KEY and OPENAI_BASE_URL environment variables)
client = create_compatible_openai_client(model_name="gpt-4")

# Or use default model (read from OPENAI_MODEL_NAME environment variable)
client = create_compatible_openai_client()

Using MCP Adapter

import asyncio
from orcakit_sdk import get_mcp_client, get_mcp_tools

async def main():
    # MCP server configuration (JSON format)
    server_configs = '''
    {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/files"]
            }
        }
    }
    '''
    
    # Get MCP client
    client = await get_mcp_client(server_configs)
    
    # Get available tools
    tools = await get_mcp_tools(server_configs)
    print(f"Loaded {len(tools)} tools")

asyncio.run(main())
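Since `get_mcp_client` takes the server configuration as a JSON string, it can be useful to parse and sanity-check it before handing it over. The helper below is a hypothetical sketch (not part of the SDK) that validates the `mcpServers` shape with only the standard library:

```python
import json

def validate_server_configs(server_configs: str) -> dict:
    """Hypothetical helper: parse and sanity-check the JSON config string
    before passing it to get_mcp_client / get_mcp_tools."""
    config = json.loads(server_configs)  # raises a ValueError subclass on malformed JSON
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing a 'command' field")
    return servers

servers = validate_server_configs('''
{
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/files"]
        }
    }
}
''')
print(sorted(servers))  # → ['filesystem']
```

Failing fast on a malformed config string gives a clearer error than whatever the client raises during connection setup.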

Loading Chat Models

from orcakit_sdk import load_chat_model

# Load model from fully qualified name
model = load_chat_model("openai/gpt-4")

# Or use OpenAI-compatible model
model = load_chat_model("compatible_openai/deepseek-chat")

Processing Message Content

from orcakit_sdk import get_message_text
from langchain_core.messages import HumanMessage

msg = HumanMessage(content="Hello, world!")
text = get_message_text(msg)
print(text)  # Output: Hello, world!

Environment Variables

The SDK reads the following environment variables:

  • OPENAI_API_KEY: OpenAI API key
  • OPENAI_BASE_URL: OpenAI API base URL (for compatible services)
  • OPENAI_MODEL_NAME: Default model name

Example .env file:

OPENAI_API_KEY=sk-xxx
OPENAI_BASE_URL=https://api.deepseek.com/v1
OPENAI_MODEL_NAME=deepseek-chat

API Reference

MCP Adapter

get_mcp_client(server_configs: str) -> MultiServerMCPClient | None

Get or initialize the global MCP client.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • MultiServerMCPClient instance or None (if initialization fails)

get_mcp_tools(server_configs: str) -> list[Callable[..., object]]

Get the list of tools provided by the MCP server.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • List of tool functions

clear_mcp_cache() -> None

Clear the MCP client and tools cache (mainly for testing).
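The existence of `clear_mcp_cache` suggests the adapter memoizes a global client and tool list. A minimal sketch of that get-or-init plus clear pattern is shown below; all names and details are illustrative, not the SDK's actual source:

```python
# Sketch of the presumed caching pattern behind get_mcp_client /
# clear_mcp_cache. The stand-in object() represents real client setup.
_client_cache: dict[str, object] = {}

def get_client_cached(server_configs: str) -> object:
    """Return a cached client per config string, creating it on first use."""
    if server_configs not in _client_cache:
        _client_cache[server_configs] = object()  # stand-in for client init
    return _client_cache[server_configs]

def clear_cache() -> None:
    """Drop all cached clients (useful between tests)."""
    _client_cache.clear()

a = get_client_cached("{}")
b = get_client_cached("{}")
print(a is b)  # True: the same config string reuses the cached client
clear_cache()
c = get_client_cached("{}")
print(a is c)  # False: after clearing, a fresh client is built
```

This is why `clear_mcp_cache` matters in test suites: without it, a client created by one test would leak into the next.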

Model Tools

create_compatible_openai_client(model_name: str | None = None) -> ChatOpenAI

Create an OpenAI-compatible chat model client.

Parameters:

  • model_name: Model name (optional, defaults to environment variable)

Returns:

  • ChatOpenAI client instance

load_chat_model(fully_specified_name: str) -> BaseChatModel

Load a chat model from a fully qualified name.

Parameters:

  • fully_specified_name: String in the format provider/model

Returns:

  • BaseChatModel instance
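The `provider/model` convention can be illustrated with a small parsing sketch. The helper below is hypothetical (the real `load_chat_model` also dispatches on the provider to construct the right client class):

```python
def split_model_name(fully_specified_name: str) -> tuple[str, str]:
    """Hypothetical sketch: split a 'provider/model' string into its parts."""
    provider, sep, model = fully_specified_name.partition("/")
    if not sep or not model:
        raise ValueError(
            f"expected 'provider/model', got {fully_specified_name!r}"
        )
    return provider, model

print(split_model_name("openai/gpt-4"))                    # ('openai', 'gpt-4')
print(split_model_name("compatible_openai/deepseek-chat")) # ('compatible_openai', 'deepseek-chat')
```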

Utility Functions

get_message_text(msg: BaseMessage) -> str

Extract text content from a message object.

Parameters:

  • msg: LangChain message object

Returns:

  • Extracted text content
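A helper like this is needed because LangChain message `content` may be either a plain string or a list of content blocks. The sketch below shows the general extraction idea in plain Python, using dicts in place of real message objects; it is illustrative, not the SDK's actual implementation:

```python
def extract_text(content) -> str:
    """Sketch of text extraction for LangChain-style message content,
    which may be a plain string or a list of blocks (strings, or dicts
    carrying a 'text' key)."""
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict):
            parts.append(block.get("text", ""))
    return "".join(parts)

print(extract_text("Hello, world!"))  # Hello, world!
print(extract_text([{"type": "text", "text": "Hello, "},
                    {"type": "text", "text": "world!"}]))  # Hello, world!
```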

Development

Install Development Dependencies

make install-dev

Run Tests

make test

Code Formatting

make format

Code Linting

make lint

Project Structure

orcakit-sdk/
├── src/
│   └── orcakit_sdk/
│       ├── __init__.py          # Package entry point
│       ├── mcp_adapter.py       # MCP adapter
│       ├── model.py             # Model utilities
│       └── utils.py             # Utility functions
├── tests/
│   ├── unit_tests/
│   │   ├── test_utils.py
│   │   └── test_model.py
│   └── integration_tests/
├── pyproject.toml               # Project configuration
├── Makefile                     # Development commands
└── README.md                    # Project documentation

Dependencies

Core dependencies:

  • langgraph >= 0.6.6: LangGraph framework
  • langchain >= 0.2.14: LangChain core library
  • langchain-openai >= 0.1.22: OpenAI integration
  • langchain-mcp-adapters >= 0.1.9: MCP adapter
  • python-dotenv >= 1.0.1: Environment variable management

License

MIT License

Contributing

Issues and Pull Requests are welcome!

Author

William Fu-Hinthorn (jubaoliang@gmail.com)
