
Model Smart Contract Protocol (MSCP)

A standard protocol that enables LLM applications to interact with EVM-compatible networks.


Features

Component as a service
The AI agent interacts with the network by operating different components.

Fast integration
The component-based design makes it easier to build workflows and accelerates the development of AI applications.

Unified interaction
Consistent rules and protocols standardize calls to contracts with different functions, ensuring that AI interactions remain consistent.

Dynamic expansion
The AI agent can add custom on-chain components with greater flexibility.

EVM compatibility
MSCP can interact with contracts on multiple EVM-compatible networks at the same time, giving it greater adaptability for tasks in complex scenarios.

Decentralization
Access component capabilities without permission, share on-chain data, and provide persistent services and information verification.

Architecture

MSCP Architecture

MSCP consists of three parts:

Component: An on-chain component that complies with EIP-7654. It implements the contract's specific functions and provides custom services.

Connector: A method and specification for parsing components and processing contract component requests.

Chat2Web3: An interoperability layer that automatically converts the interaction methods of contract components into tool functions that an LLM can call.
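As an illustration of that conversion (a sketch with assumed names, not the actual mscp internals), Chat2Web3 can be pictured as mapping each method in a component's ABI to an OpenAI-style tool definition:

```python
# Hypothetical sketch: mapping one parsed ABI method to an OpenAI tool
# schema. The function and mapping table are assumptions for illustration.

# Simplified Solidity -> JSON Schema type mapping
SOLIDITY_TO_JSON = {
    "address": "string",
    "uint256": "integer",
    "string": "string",
    "bool": "boolean",
}

def abi_method_to_tool(method: dict) -> dict:
    """Convert one ABI method entry into an OpenAI tool (function) schema."""
    properties = {
        inp["name"]: {"type": SOLIDITY_TO_JSON.get(inp["type"], "string")}
        for inp in method["inputs"]
    }
    return {
        "type": "function",
        "function": {
            "name": method["name"],
            "description": f"Call the {method['name']} method of the component contract",
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(properties),
            },
        },
    }

# Example: a read method that looks up a user by address
get_user = {"name": "getUser", "inputs": [{"name": "account", "type": "address"}]}
tool = abi_method_to_tool(get_user)
```

A list of such schemas is what ends up in the `tools` parameter of the chat completion request shown in the Quick Start below.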

Quick Start

Install

pip3 install mscp

Set up environment variables

Refer to the .env.example file and create a .env file with your own settings. You can import the environment variables in two ways.
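The two usual ways (an assumption here, since the README does not spell them out) are loading the .env file at startup with python-dotenv, or exporting the variables in your shell and reading them directly:

```python
import os

# Method 1: load variables from a .env file at startup
# (requires python-dotenv: from dotenv import load_dotenv; load_dotenv())

# Method 2: export them in the shell before running the script, e.g.
#   export EVM_PRIVATE_KEY=0x...
# and read them directly. A placeholder key is set here for the demo only:
os.environ.setdefault("EVM_PRIVATE_KEY", "0x" + "11" * 32)
private_key = os.getenv("EVM_PRIVATE_KEY")
```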

Deploy Component Smart Contract

Here is a simple component, example.sol, that you can deploy on any EVM-compatible network.
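The contract source is not reproduced here; for illustration, an EIP-7654-style component serving the query used later might expose an ABI fragment like the following (an assumption for this example, not the actual example.sol):

```python
# Assumed ABI fragment for a hypothetical example.sol component:
# a view method returning a user's name and age for an address.
EXAMPLE_COMPONENT_ABI = [
    {
        "name": "getUser",
        "type": "function",
        "stateMutability": "view",
        "inputs": [{"name": "account", "type": "address"}],
        "outputs": [
            {"name": "name", "type": "string"},
            {"name": "age", "type": "uint256"},
        ],
    }
]
```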

Integrate MSCP into your AI application

from openai import OpenAI
from eth_account import Account
from mscp import Connector, Chat2Web3
from dotenv import load_dotenv
import os

load_dotenv()
# Create a connector to connect to the component
component_connector = Connector(
    "http://localhost:8545",  # RPC of the component network
    "0x0E2b5cF475D1BAe57C6C41BbDDD3D99ae6Ea59c7",  # component address
    Account.from_key(os.getenv("EVM_PRIVATE_KEY")),
)

# Create a Chat2Web3 instance
chat2web3 = Chat2Web3([component_connector])

# Create a client for OpenAI
client = OpenAI(api_key=os.getenv("OPENAI_KEY"), base_url=os.getenv("OPENAI_API_BASE"))

# Set up the conversation
messages = [
    {
        "role": "user",
        "content": "What is the user's name and age? 0x8241b5b254e47798E8cD02d13B8eE0C7B5f2a6fA",
    }
]

# Add the chat2web3 to the tools
params = {
    "model": "gpt-3.5-turbo",
    "messages": messages,
    "tools": chat2web3.functions,
}

# Start the conversation
response = client.chat.completions.create(**params)

# Get the function message
func_msg = response.choices[0].message

# filter out chat2web3 functions
if func_msg.tool_calls and chat2web3.has(func_msg.tool_calls[0].function.name):

    # execute the function requested by the LLM
    function_result = chat2web3.call(func_msg.tool_calls[0].function)

    messages.extend(
        [
            func_msg,
            {
                "role": "tool",
                "tool_call_id": func_msg.tool_calls[0].id,
                "content": function_result,
            },
        ]
    )

    # Model responds with final answer
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)

    print(response.choices[0].message.content)
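Internally, chat2web3.call has to route the tool call back to the connector that owns the method and decode the JSON arguments supplied by the LLM. A stand-alone sketch of that dispatch, using a stub connector and assumed structure rather than the mscp source:

```python
import json

# Minimal stand-in for a connector: knows its methods and executes calls.
class StubConnector:
    def __init__(self, methods):
        self.methods = methods  # method name -> callable

    def has(self, name):
        return name in self.methods

    def call(self, name, args):
        return self.methods[name](**args)

def dispatch(connectors, function):
    """Route an LLM tool call to the connector that owns the method."""
    args = json.loads(function["arguments"])  # arguments arrive as a JSON string
    for connector in connectors:
        if connector.has(function["name"]):
            return connector.call(function["name"], args)
    raise KeyError(f"no connector exposes {function['name']}")

# Usage: a fake getUser mirroring the prompt in the example above
stub = StubConnector({"getUser": lambda account: f"name=Alice age=30 ({account[:6]}...)"})
result = dispatch([stub], {
    "name": "getUser",
    "arguments": '{"account": "0x8241b5b254e47798E8cD02d13B8eE0C7B5f2a6fA"}',
})
```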

Use MSCP in the aser agent

component_connector = Connector(
    "http://127.0.0.1:8545",  # RPC of the component network
    "0x0E2b5cF475D1BAe57C6C41BbDDD3D99ae6Ea59c7",  # component address
    Account.from_key(os.getenv("EVM_PRIVATE_KEY")),
)
chat2web3 = Chat2Web3([component_connector])

# Agent is provided by the aser framework
agent = Agent(name="chat2web3", model="gpt-4o", chat2web3=chat2web3)
response = agent.chat("What is the user's name and age? 0x8241b5b254e47798E8cD02d13B8eE0C7B5f2a6fA")

print(response)
