
Model Smart Contract Protocol (MSCP)

The Model Smart Contract Protocol (MSCP) is a standard protocol that enables LLM applications to interact with EVM-compatible networks.

Features

Component as a service
The AI agent interacts with the network by operating different components.

Fast integration
Component-based design makes it easier to build workflows and accelerates the development of AI applications.

Unified interaction
Consistent rules and protocols standardize calls to contracts with different functions, keeping AI interactions uniform.

Dynamic expansion
The AI agent can add custom on-chain components with greater flexibility.

EVM compatibility
MSCP can interact with contracts on multiple EVM-compatible networks at the same time, making it more adaptable to complex scenarios.

Decentralization
Access component capabilities without permission, share on-chain data, and provide persistent services and information verification.

Architecture

MSCP consists of three parts:

Component: An on-chain component that complies with EIP-7654. It implements the contract's specific functions and provides custom services.

Connector: A method and specification for parsing Components and processing contract-component requests.

Chat2Web3: An interoperability layer that automatically converts the interaction methods of contract components into tool functions that the LLM can call.
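To make the Chat2Web3 conversion concrete, the mapping from a contract method's ABI inputs to an OpenAI-style tool definition can be sketched as below. This is an illustrative stand-in, not the library's actual implementation; the `abi_method_to_tool` helper and the Solidity-to-JSON type mapping are assumptions.

```python
# Illustrative sketch of how a contract method's ABI inputs could be
# converted into a Chat Completions tool-function schema. This is NOT
# the actual mscp implementation; names and the mapping are assumptions.

# Rough mapping from common Solidity types to JSON Schema types (assumed)
SOLIDITY_TO_JSON = {
    "address": "string",
    "string": "string",
    "bool": "boolean",
    "uint256": "integer",
    "int256": "integer",
}

def abi_method_to_tool(name: str, description: str, abi_inputs: list) -> dict:
    """Build a Chat Completions tool definition from ABI input specs."""
    properties = {
        inp["name"]: {"type": SOLIDITY_TO_JSON.get(inp["type"], "string")}
        for inp in abi_inputs
    }
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": [inp["name"] for inp in abi_inputs],
            },
        },
    }

# Example: a getUser(address) method, as in the Quick Start below
tool = abi_method_to_tool(
    "getUserInfoByAddress",
    "Return a user's name and age for a given address.",
    [{"name": "user", "type": "address"}],
)
```

The LLM only ever sees the JSON Schema side of this mapping; the Connector keeps the ABI side and performs the actual contract call.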

Quick Start

Install

pip3 install mscp

Set up environment variables

Refer to the .env.example file and create a .env file with your own settings. There are two ways to import the environment variables.
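The two import methods are presumably exporting the variables in the shell before running, or keeping them in a .env file and loading it at startup (as the Quick Start below does with python-dotenv). A minimal stand-in parser, shown only to illustrate what loading a .env file amounts to:

```python
import os

# Method 1 (assumed): export variables in the shell before running, e.g.
#   export EVM_PRIVATE_KEY=0x... && python app.py
# Method 2 (assumed): keep them in a .env file and load it at startup,
#   e.g. with python-dotenv: from dotenv import load_dotenv; load_dotenv()

def load_env_file(path=".env"):
    """Minimal illustration of what load_dotenv() does: read KEY=VALUE
    lines into os.environ (the real python-dotenv handles more cases,
    such as quoting and variable expansion)."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

Either way, the variables end up in `os.environ`, which is where the Quick Start reads them from via `os.getenv`.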

Deploy Component Smart Contract

Here is a simple component, example.sol, that you can deploy on any EVM-compatible network.

Integrate MSCP into your AI application

from openai import OpenAI
from eth_account import Account
from mscp import Connector, Chat2Web3
from dotenv import load_dotenv
import os

load_dotenv()
# Create a connector to connect to the component
component = Connector(
    "https://sepolia.base.org",  # RPC of the component network
    "0xd08dC2590B43bbDA7bc1614bDf80877EffE72CF0",  # component address
)

# Get the methods of the component
methods = component.get_methods()


# Import the private key from the environment variable
account = Account.from_key(os.getenv("EVM_PRIVATE_KEY"))

# Create a Chat2Web3 instance
chat2web3 = Chat2Web3(account)

# Add a method to the Chat2Web3
chat2web3.add(
    name="getUserInfoByAddress",
    prompt="When a user wants to get a user's name and age, it will return 2 values: one is the name, and the other is the age.",
    method=methods["getUser"],  # Use the getUser method from the contract
)

# Create a client for OpenAI
client = OpenAI(api_key=os.getenv("OPENAI_KEY"), base_url=os.getenv("OPENAI_API_BASE"))

# Set up the conversation
messages = [
    {
        "role": "user",
        "content": "What is the user's name and age? 0xbdbf9715aedc12712daac033d4952280d1d29ac3",
    }
]

# Add the chat2web3 to the tools
params = {
    "model": "gpt-3.5-turbo",
    "messages": messages,
    "tools": chat2web3.functions,
}

# Start the conversation
response = client.chat.completions.create(**params)

# Get the function message
func_msg = response.choices[0].message

# Filter out the chat2web3 function call
if func_msg.tool_calls and chat2web3.has(func_msg.tool_calls[0].function.name):

    # Execute the function call from the LLM
    function_result = chat2web3.call(func_msg.tool_calls[0].function)

    messages.extend(
        [
            func_msg,
            {
                "role": "tool",
                "tool_call_id": func_msg.tool_calls[0].id,
                "content": function_result,
            },
        ]
    )

    # Model responds with final answer
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)

    print(response.choices[0].message.content)
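The example above handles a single tool call; when the model emits several, each call needs its own `tool` message keyed by its `tool_call_id`. The dispatch pattern can be shown with a plain registry standing in for `chat2web3.has()`/`chat2web3.call()` (the registry and the fake contract result below are illustrative, not part of mscp):

```python
# Generic dispatch loop for assistant tool calls. The registry dict stands
# in for chat2web3: membership plays the role of chat2web3.has(), and the
# stored callable plays the role of chat2web3.call().
def dispatch_tool_calls(tool_calls, registry):
    """Return one {"role": "tool", ...} message per recognized tool call."""
    tool_messages = []
    for call in tool_calls:
        name = call["function"]["name"]
        if name not in registry:
            continue  # not one of our on-chain tools
        result = registry[name](call["function"]["arguments"])
        tool_messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": result,
        })
    return tool_messages

# Illustrative registry and calls (stand-ins, not real contract reads)
registry = {
    "getUserInfoByAddress": lambda args: '{"name": "alice", "age": 30}',
}
calls = [{
    "id": "call_1",
    "function": {
        "name": "getUserInfoByAddress",
        "arguments": '{"user": "0xbdbf9715aedc12712daac033d4952280d1d29ac3"}',
    },
}]
msgs = dispatch_tool_calls(calls, registry)
```

Appending all returned tool messages to `messages` before the final `client.chat.completions.create` call generalizes the single-call branch above.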
