Pica LangChain SDK

pica-langchain


A Python package for integrating Pica with LangChain.

Full Documentation: https://docs.picaos.com/sdk/langchain

Installation

pip install pica-langchain

Usage

The PicaClientOptions class allows you to configure the Pica client with the following options:

| Option | Type | Required | Default | Description |
|---|---|---|---|---|
| `server_url` | `str` | No | `https://api.picaos.com` | URL of a self-hosted Pica server. |
| `connectors` | `List[str]` | No | `[]` | Connector keys to filter by. Pass `["*"]` to initialize all available connectors, or specific keys to filter. If empty, no connections are initialized. |
| `actions` | `List[str]` | No | `None` | Action IDs to filter by. Defaults to all actions. |
| `permissions` | `Literal["read", "write", "admin"]` | No | `None` | Permission level to filter actions by: `"read"` allows GET only, `"write"` allows POST/PUT/PATCH, and `"admin"` allows all methods (default: `"admin"`). |
| `authkit` | `bool` | No | `False` | If `True`, the SDK uses AuthKit to prompt the user to connect to a platform they do not yet have access to. |
| `identity` | `str` | No | `None` | Filter connections by a specific identity ID. |
| `identity_type` | `"user"`, `"team"`, `"organization"`, or `"project"` | No | `None` | Filter connections by identity type. |
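The `permissions` option gates actions by HTTP method as described above. A minimal sketch of that mapping (the dict and helper below are illustrative only, not part of the SDK):

```python
# Illustrative mapping of permission levels to allowed HTTP methods,
# following the table above. These names are NOT part of the SDK API.
PERMISSION_METHODS = {
    "read": {"GET"},
    "write": {"POST", "PUT", "PATCH"},
    "admin": {"GET", "POST", "PUT", "PATCH", "DELETE"},
}

def is_allowed(permission: str, method: str) -> bool:
    """Return True if `method` is permitted at `permission` level."""
    return method.upper() in PERMISSION_METHODS[permission]

print(is_allowed("read", "GET"))   # True
print(is_allowed("read", "POST"))  # False
```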

The create_pica_agent function allows customizing the following parameters:

| Option | Type | Required | Default | Description |
|---|---|---|---|---|
| `verbose` | `bool` | No | `False` | Whether to print verbose logs. |
| `system_prompt` | `str` | No | `None` | A custom system prompt appended to the default system prompt. |
| `agent_type` | `AgentType` | No | `OPENAI_FUNCTIONS` | The type of agent to create. |
| `tools` | `List[BaseTool]` | No | `None` | A list of additional tools to use in the agent. |
| `return_intermediate_steps` | `bool` | No | `False` | Whether to return intermediate steps. |

Quick Start

from langchain_openai import ChatOpenAI
from langchain.agents import AgentType
from pica_langchain import PicaClient, create_pica_agent
from pica_langchain.models import PicaClientOptions

# Initialize the Pica client
pica_client = PicaClient(
    secret="your-pica-secret",
    options=PicaClientOptions(
        # server_url="https://my-self-hosted-server.com",
        # identity_type="user",
        # identity="user-id",
        # authkit=True,
        # actions=[""], # Initialize specific action ids (e.g. ["conn_mod_def::F_JeJ_A_TKg::cc2kvVQQTiiIiLEDauy6zQ"])
        # permissions="read", # Filter actions by permission level
        
        connectors=["*"] # Initialize all available connections for this example
    )
)

pica_client.initialize()

# Create a LangChain agent with Pica tools
llm = ChatOpenAI(
    temperature=0, 
    model="gpt-4.1"
)

# Create an agent with Pica tools
agent = create_pica_agent(
    client=pica_client,
    llm=llm,
    agent_type=AgentType.OPENAI_FUNCTIONS,
    # return_intermediate_steps=True, # Optional: Return intermediate steps

    # Optional: Custom system prompt to append
    system_prompt="Always start your response with `Pica works like ✨\n`"
)

# Use the agent
result = agent.invoke({
    "input": (
            "Star the picahq/pica repo on GitHub. "
            "Then, list 5 of the repositories that I have starred on GitHub."
    )
})

print(result)
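When the agent is created with `return_intermediate_steps=True`, the result also carries an `intermediate_steps` list alongside `output`. A minimal sketch of unpacking it, using a mocked result in place of a real `agent.invoke(...)` call (the dict shape follows LangChain's standard agent output; the step contents here are invented for illustration):

```python
# Mocked result standing in for `agent.invoke(...)` on an agent created
# with return_intermediate_steps=True. LangChain's convention is a list
# of (action, observation) pairs plus the final "output" string.
result = {
    "output": "Pica works like ✨\nStarred picahq/pica.",
    "intermediate_steps": [
        ({"tool": "github_action", "input": "star picahq/pica"}, "200 OK"),
    ],
}

print(result["output"])
for action, observation in result["intermediate_steps"]:
    print(f"{action} -> {observation}")
```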

Using Individual Tools

from langchain.agents import AgentType, initialize_agent
from langchain_openai import ChatOpenAI
from pica_langchain import PicaClient, create_pica_tools

# Initialize the Pica client
pica_client = PicaClient(secret="your-pica-secret")

pica_client.initialize()

# Create Pica tools
tools = create_pica_tools(pica_client)

# Create a custom agent with the tools
llm = ChatOpenAI(temperature=0, model="gpt-4.1")
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.OPENAI_FUNCTIONS
)

# Use the agent
result = agent.run("What actions are available in Gmail?")
print(result)

Using Model Context Protocol (MCP) Tools

The SDK supports integration with Model Context Protocol (MCP) servers, allowing you to connect to external tool providers via the MCP protocol.

import asyncio
from langchain.agents import AgentType
from langchain_openai import ChatOpenAI
from pica_langchain import PicaClient, create_pica_agent
from pica_langchain.models import PicaClientOptions

# Configure MCP servers
mcp_options = {
    "math": {
        "command": "python",
        "args": ["./path/to/math_server.py"],
        "transport": "stdio",
    },
    "weather": {
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    }
}

async def main():
    # Create client with async initialization
    pica_client = await PicaClient.create(
        secret="your-pica-secret",
        options=PicaClientOptions(
            connectors=["*"],  # Initialize all available connections
            mcp_options=mcp_options,  # Add MCP server options
        ),
    )

    # Create an agent with both Pica and MCP tools
    llm = ChatOpenAI(temperature=0, model="gpt-4.1")
    agent = create_pica_agent(
        client=pica_client,
        llm=llm,
        agent_type=AgentType.OPENAI_FUNCTIONS,
    )

    # Use both Pica platform actions and MCP tools
    result = await agent.ainvoke({
        "input": "Calculate 25 * 17, then check the weather in New York, and finally list all connectors that Pica supports"
    })
    
    print(result["output"])

if __name__ == "__main__":
    asyncio.run(main())

Development

Setup

  1. Clone the repository:
git clone https://github.com/yourusername/pica-langchain.git
cd pica-langchain
  2. Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate
  3. Create a connection on Pica:
     1. Create an account on app.picaos.com.
     2. Navigate to the "My Connections" tab and create the required connection.
     3. Retrieve your API key from the "API Keys" section.
  4. Export the required environment variables:
export PICA_SECRET="your-pica-secret"
export OPENAI_API_KEY="your-openai-api-key"
  5. Install development dependencies:
pip install -e ".[dev]"

Running Tests

pytest

Logging

The Pica LangChain SDK uses Python's logging module. The log level can be set via the PICA_LOG_LEVEL environment variable.

The following log levels are available:

  • debug
  • info
  • warning
  • error
  • critical
export PICA_LOG_LEVEL="debug"
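These level names correspond to Python's standard logging levels. A sketch of setting the variable from Python before importing the SDK, so it is picked up at import time (the mapping dict below is illustrative; it is not the SDK's actual internal code):

```python
import logging
import os

# Set the SDK's log level before importing pica_langchain
# (assumption: the SDK reads PICA_LOG_LEVEL when it initializes).
os.environ["PICA_LOG_LEVEL"] = "debug"

# Illustrative mapping from PICA_LOG_LEVEL names to stdlib levels.
LEVELS = {
    "debug": logging.DEBUG,
    "info": logging.INFO,
    "warning": logging.WARNING,
    "error": logging.ERROR,
    "critical": logging.CRITICAL,
}
level = LEVELS[os.environ["PICA_LOG_LEVEL"]]
logging.basicConfig(level=level)
logging.getLogger(__name__).debug("debug logging enabled")
```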

Examples

Examples can be found in the examples directory.

> python3 examples/use_with_langchain.py # LangChain agent example

License

This project is licensed under the GNU General Public License v3.0.
