OpenTerms protocol integration for CrewAI: permission-aware AI agents

crewai-openterms

Permission-aware AI agents for CrewAI. Checks a domain's openterms.json before your agent acts, so it knows what it's allowed to do.

Install

pip install crewai-openterms

Two tools

OpenTermsCheckTool

A tool the agent calls to check permissions and get a structured result.

from crewai import Agent, Task, Crew
from crewai_openterms import OpenTermsCheckTool

checker = OpenTermsCheckTool()

researcher = Agent(
    role="Web Researcher",
    goal="Gather information from websites while respecting their terms",
    backstory=(
        "You are a thorough researcher. Before accessing any website, "
        "you always check its openterms.json permissions using the "
        "openterms_check tool."
    ),
    tools=[checker],
)

task = Task(
    description=(
        "Check whether github.com allows read_content and scrape_data. "
        "Report the results."
    ),
    expected_output="A summary of what actions are permitted on github.com.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()

OpenTermsGuardTool

A gate that returns clear go/no-go instructions. When a domain allows the action, the response also includes any discovered MCP servers and API specs from the site's discovery block.

from crewai import Agent, Task, Crew
from crewai_tools import ScrapeWebsiteTool
from crewai_openterms import OpenTermsGuardTool

guard = OpenTermsGuardTool()
scraper = ScrapeWebsiteTool()

researcher = Agent(
    role="Compliant Web Scraper",
    goal="Scrape websites for data, but only when permitted",
    backstory=(
        "Before scraping any site, you MUST use openterms_guard to check "
        "whether scraping is allowed. If the tool returns DENIED, do not "
        "proceed. If it returns ALLOWED, you may use the scrape tool."
    ),
    tools=[guard, scraper],
)

task = Task(
    description=(
        "Scrape pricing information from example.com. "
        "First check if scraping is permitted."
    ),
    expected_output="Pricing data or an explanation of why it could not be retrieved.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()

Strict mode

By default, if a domain has no openterms.json, the guard returns "PROCEED WITH CAUTION." In strict mode, it blocks instead:

guard = OpenTermsGuardTool(strict=True)
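The guard's decision logic can be sketched roughly as follows. This is an illustrative re-implementation, not the library's actual code, and it assumes a manifest schema in which a "permissions" mapping pairs action names with booleans:

```python
# Illustrative sketch of a guard decision (assumed manifest schema,
# not the library's real implementation). `manifest` is the parsed
# openterms.json as a dict, or None if the domain publishes none.
def guard_decision(manifest, action, strict=False):
    if manifest is None:
        # No openterms.json: strict mode blocks, default mode warns.
        if strict:
            return f"DENIED: no openterms.json found; cannot verify '{action}'"
        return f"PROCEED WITH CAUTION: no openterms.json found for '{action}'"
    permissions = manifest.get("permissions", {})
    if permissions.get(action) is True:
        return f"ALLOWED: '{action}' is permitted"
    return f"DENIED: '{action}' is not permitted"
```

With `strict=False` a missing manifest yields the caution message; with `strict=True` the same situation is denied outright.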

Discovery integration

When a domain's openterms.json includes a discovery block (v0.3.0), the guard tool surfaces MCP server URLs and API spec URLs in its response. This lets the agent know not just whether it can interact, but how:

ALLOWED: acme-corp.com permits 'api_access'. You may proceed.
MCP servers available: https://acme-corp.com/mcp/sse.
API specs available: https://api.acme-corp.com/v1/openapi.json.
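For reference, a manifest that would produce a response like the one above might look like the following. The field names here are illustrative assumptions about the openterms.json schema, not a confirmed specification:

```json
{
  "version": "0.3.0",
  "permissions": {
    "read_content": true,
    "api_access": true,
    "scrape_data": false
  },
  "discovery": {
    "mcp_servers": ["https://acme-corp.com/mcp/sse"],
    "api_specs": ["https://api.acme-corp.com/v1/openapi.json"]
  }
}
```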

Multi-agent example

from crewai import Agent, Task, Crew
from crewai_openterms import OpenTermsCheckTool, OpenTermsGuardTool

# Compliance agent checks permissions first
compliance = Agent(
    role="Compliance Officer",
    goal="Verify that all web interactions are permitted",
    tools=[OpenTermsCheckTool()],
    backstory="You check openterms.json for every domain before any agent acts.",
)

# Research agent does the actual work
researcher = Agent(
    role="Researcher",
    goal="Gather data from permitted sources",
    tools=[OpenTermsGuardTool()],
    backstory=(
        "You research topics online. Always use openterms_guard before "
        "accessing any site. If denied, skip that site and try another."
    ),
)

check_task = Task(
    description="Check permissions for github.com, stripe.com, and reddit.com for read_content and scrape_data.",
    expected_output="Permission matrix for all three domains.",
    agent=compliance,
)

research_task = Task(
    description="Using only permitted domains, gather information about API pricing trends.",
    expected_output="Summary of API pricing trends from permitted sources.",
    agent=researcher,
    context=[check_task],
)

crew = Crew(
    agents=[compliance, researcher],
    tasks=[check_task, research_task],
)

result = crew.kickoff()


