
Nacos extension component for AgentScope - Python SDK


AgentScope Extensions Nacos


An extension component for the AgentScope framework that provides Nacos integration capabilities, supporting dynamic configuration management and MCP tool integration.

✨ Key Features

  • 🔄 Dynamic Configuration Management: Host agent configurations (prompts, model configs, tool lists, etc.) in Nacos for centralized management and real-time hot updates without restarting the application
  • 🛠️ MCP Tool Integration: Automatically discover and register tool servers from the Nacos MCP Registry with dynamic tool list updates
  • 🎯 Multi-Model Support: Support for OpenAI, Anthropic, Ollama, Google Gemini, Alibaba Cloud Qwen, and more

📋 Prerequisites

📝 Version Compatibility

Extension Version | AgentScope | AgentScope Runtime | Nacos Server
------------------|------------|--------------------|-------------
1.0.0             | >= 1.0.7   | >= 1.0.1           | >= 3.1.0

Note: Starting from version 1.0.0, the A2A protocol implementation has been removed from this extension. AgentScope now natively supports the A2A protocol with Nacos as the A2A Registry. Please use the built-in A2A support in AgentScope directly.

📦 Installation

pip install agentscope-extension-nacos

Or install from source:

git clone https://github.com/nacos-group/agentscope-extensions-nacos.git
cd agentscope-extensions-nacos/python
pip install -e .

🔧 Configuring Nacos Connection

Before using this extension, configure the Nacos connection using one of the following methods.

Method 1: Environment Variables

# Nacos server address (required)
export NACOS_SERVER_ADDRESS=localhost:8848

# Nacos namespace (required)
export NACOS_NAMESPACE_ID=public

# Local Nacos authentication (optional)
export NACOS_USERNAME=nacos
export NACOS_PASSWORD=nacos

# Or use Alibaba Cloud MSE authentication (optional)
export NACOS_ACCESS_KEY=your-access-key
export NACOS_SECRET_KEY=your-secret-key

Method 2: Code Configuration

from v2.nacos import ClientConfigBuilder
from agentscope_extension_nacos.utils.nacos_service_manager import NacosServiceManager

# Configure Nacos connection
client_config = (ClientConfigBuilder()
				 .server_address("localhost:8848")
				 .namespace_id("public")
				 .username("nacos")
				 .password("nacos")
				 .build())

# Set as global configuration
NacosServiceManager.set_global_config(client_config)

🚀 Usage Scenarios

Scenario 1: Model Configuration Hosting

Host model configuration in Nacos to enable dynamic model switching and parameter adjustment.

1. Create Model Configuration in Nacos

Create the following configuration in the Nacos console:

Group: nacos-ai-model
DataId: {model_key}.json (e.g., my-model.json)
Format: JSON

{
  "modelName": "qwen-max",
  "modelProvider": "dashscope",
  "apiKey": "sk-your-api-key",
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
  "args": {
    "temperature": 0.7,
    "max_tokens": 2000
  }
}

Supported Model Providers:

  • openai - OpenAI GPT series
  • anthropic - Anthropic Claude series
  • ollama - Ollama local models
  • gemini - Google Gemini
  • dashscope - Alibaba Cloud Qwen

2. Use in Code

import asyncio
from v2.nacos import ClientConfigBuilder
from agentscope_extension_nacos.utils.nacos_service_manager import NacosServiceManager
from agentscope_extension_nacos.model.nacos_chat_model import NacosChatModel
from agentscope.agent import ReActAgent
from agentscope.formatter import OpenAIChatFormatter
from agentscope.memory import InMemoryMemory


async def main():
	# 1. Configure Nacos connection
	client_config = (ClientConfigBuilder()
					 .server_address("localhost:8848")
					 .namespace_id("public")
					 .username("nacos")
					 .password("nacos")
					 .build())
	NacosServiceManager.set_global_config(client_config)

	# 2. Create Nacos-managed model
	model = NacosChatModel(
			model_key="my-model",  # Corresponds to DataId: my-model.json in Group: nacos-ai-model
			stream=True
	)

	# 3. Use in agent
	agent = ReActAgent(
			name="MyAgent",
			sys_prompt="You are an AI assistant",
			model=model,
			formatter=OpenAIChatFormatter(),
			memory=InMemoryMemory()
	)

	# 4. Use the agent
	from agentscope.message import Msg
	response = await agent(Msg(
			name="user",
			content="Hello",
			role="user"
	))
	print(response.content)

	# 5. Cleanup resources
	await NacosServiceManager.cleanup()


if __name__ == "__main__":
	asyncio.run(main())

3. Dynamic Model Configuration Updates

After modifying the model configuration (e.g., my-model.json) in the Nacos console, the agent automatically switches to the new model without restarting the application.
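
For example, publishing the following update to the my-model.json DataId shown above (only modelName changed, from qwen-max to qwen-plus) would switch the agent to the new model on its next call:

```json
{
  "modelName": "qwen-plus",
  "modelProvider": "dashscope",
  "apiKey": "sk-your-api-key",
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
  "args": {
    "temperature": 0.7,
    "max_tokens": 2000
  }
}
```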


Scenario 2: Prompt Configuration Hosting

Host prompt templates in Nacos with support for variable rendering and hot updates.

1. Create Prompt Configuration in Nacos

Create the following configuration in the Nacos console:

Group: nacos-ai-prompt
DataId: {prompt_key}.json (e.g., my-assistant.json)
Format: JSON

{
  "template": "You are {{role}}, a helpful assistant specialized in {{domain}}. Please respond in {{language}}."
}

The template supports {{variable}} syntax for variable rendering.

2. Use in Code

import asyncio
import os
from v2.nacos import ClientConfigBuilder
from agentscope_extension_nacos.utils.nacos_service_manager import NacosServiceManager
from agentscope_extension_nacos.prompt.nacos_prompt_listener import NacosPromptListener
from agentscope.agent import ReActAgent
from agentscope.model import DashScopeChatModel
from agentscope.formatter import DashScopeChatFormatter
from agentscope.memory import InMemoryMemory


async def main():
    # 1. Configure Nacos connection
    client_config = (ClientConfigBuilder()
                     .server_address("localhost:8848")
                     .namespace_id("public")
                     .username("nacos")
                     .password("nacos")
                     .build())
    NacosServiceManager.set_global_config(client_config)

    # 2. Create Nacos prompt listener with template variables
    prompt_listener = NacosPromptListener(
        prompt_key="my-assistant",  # Corresponds to DataId: my-assistant.json
        args={
            "role": "Jarvis",
            "domain": "programming and technology",
            "language": "English",
        },
    )

    # 3. Create agent
    agent = ReActAgent(
        name="Jarvis",
        sys_prompt="",  # Will be set by NacosPromptListener
        model=DashScopeChatModel(
            model_name="qwen-max",
            api_key=os.getenv("DASH_SCOPE_API_KEY"),
        ),
        formatter=DashScopeChatFormatter(),
        memory=InMemoryMemory(),
    )

    # 4. Attach agent to prompt listener and initialize
    prompt_listener.attach_agent(agent)
    await prompt_listener.initialize()

    # Now the agent's sys_prompt is:
    # "You are Jarvis, a helpful assistant specialized in programming and technology. Please respond in English."

    # 5. Cleanup when done
    prompt_listener.detach_agent()
    await NacosServiceManager.cleanup()


if __name__ == "__main__":
    asyncio.run(main())

3. Dynamic Prompt Updates

After modifying the prompt template in the Nacos console, the agent's prompt will be automatically updated:

  • Variables will be re-rendered with the provided args
  • Agent's sys_prompt will be updated in real-time
  • No application restart needed

Scenario 3: MCP Tool Integration

Discover and use MCP tool servers from the Nacos MCP Registry.

1. Ensure MCP Server is Registered

MCP servers must be registered in the Nacos MCP Registry first. After registration, they can be used directly in code.

2. Use MCP Tools in Code

import asyncio
from v2.nacos import ClientConfigBuilder
from agentscope_extension_nacos.utils.nacos_service_manager import NacosServiceManager
from agentscope_extension_nacos.mcp.agentscope_nacos_mcp import (
	NacosHttpStatelessClient,
	NacosHttpStatefulClient
)
from agentscope_extension_nacos.mcp.agentscope_dynamic_toolkit import DynamicToolkit
from agentscope.agent import ReActAgent
from agentscope.model import OpenAIChatModel


async def main():
	# 1. Configure Nacos connection
	client_config = (ClientConfigBuilder()
					 .server_address("localhost:8848")
					 .namespace_id("public")
					 .username("nacos")
					 .password("nacos")
					 .build())
	NacosServiceManager.set_global_config(client_config)

	# 2. Create MCP clients
	# Stateless client (suitable for low-frequency calls)
	stateless_client = NacosHttpStatelessClient("weather-tools")

	# Stateful client (suitable for high-frequency calls)
	stateful_client = NacosHttpStatefulClient("calculator-tools")

	# 3. Create dynamic toolkit
	toolkit = DynamicToolkit()

	# 4. Register MCP clients
	await stateful_client.connect()
	await toolkit.register_mcp_client(stateless_client)
	await toolkit.register_mcp_client(stateful_client)

	# 5. Use toolkit in agent
	agent = ReActAgent(
			name="ToolAgent",
			sys_prompt="You are an AI assistant that can use tools",
			model=OpenAIChatModel(
					model_name="gpt-4",
					api_key="sk-xxx"
			),
			toolkit=toolkit
	)

	# Tools will automatically sync with Nacos configuration changes
	# No manual refresh needed

	# 6. Cleanup resources
	await stateful_client.close()
	await NacosServiceManager.cleanup()


if __name__ == "__main__":
	asyncio.run(main())

3. Dynamic Tool Updates

When MCP server tool configurations are updated in Nacos, DynamicToolkit will automatically sync the tool list, and the agent can immediately use the new tools.
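
The sync follows a push-based listener pattern. As a rough illustration only (hypothetical class and method names, not the extension's actual API), a registry that subscribes to configuration pushes can swap its tool table the moment a change arrives:

```python
# Illustrative sketch of the listener pattern behind dynamic tool sync.
# All names here are hypothetical, not the extension's real classes.
from typing import Callable, Dict, List


class ConfigListener:
    """Minimal config listener: subscribers are invoked on every publish."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[dict], None]] = []

    def subscribe(self, callback: Callable[[dict], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, new_config: dict) -> None:
        # In Nacos, this callback fires when the watched DataId changes.
        for cb in self._subscribers:
            cb(new_config)


class ToolRegistry:
    """Keeps an in-memory tool table in sync with pushed configuration."""

    def __init__(self, listener: ConfigListener) -> None:
        self.tools: Dict[str, str] = {}
        listener.subscribe(self._on_config_change)

    def _on_config_change(self, config: dict) -> None:
        # Replace the tool table wholesale; callers see the new list immediately.
        self.tools = dict(config.get("tools", {}))


listener = ConfigListener()
registry = ToolRegistry(listener)
listener.publish({"tools": {"get_weather": "weather-tools"}})
print(sorted(registry.tools))  # ['get_weather']
listener.publish({"tools": {"get_weather": "weather-tools", "add": "calculator-tools"}})
print(sorted(registry.tools))  # ['add', 'get_weather']
```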


📚 More Examples

Check the example/ directory in the repository for more complete examples.

⚙️ Advanced Configuration

NacosChatModel Backup Model

Configure a backup model to fall back to automatically when the primary model fails:

from agentscope_extension_nacos.model.nacos_chat_model import NacosChatModel
from agentscope.model import OpenAIChatModel

# Create backup model
backup_model = OpenAIChatModel(
    model_name="gpt-3.5-turbo",
    api_key="sk-xxx"
)

# Create Nacos model (with backup)
model = NacosChatModel(
    agent_name="my-agent",
    nacos_client_config=None,
    stream=True,
    backup_model=backup_model  # Use backup model when primary fails
)

Custom Nacos Configuration

Use different Nacos configurations for different components:

from v2.nacos import ClientConfigBuilder

# Create independent configuration for specific components
custom_config = (ClientConfigBuilder()
    .server_address("another-nacos:8848")
    .namespace_id("test")
    .username("nacos")
    .password("nacos")
    .build())

# Use custom configuration
model = NacosChatModel(
    agent_name="my-agent",
    nacos_client_config=custom_config  # Use custom configuration
)

NacosPromptListener with Custom Args

Dynamically render prompt templates with custom variables:

from agentscope_extension_nacos.prompt.nacos_prompt_listener import NacosPromptListener

# Create prompt listener with template variables
prompt_listener = NacosPromptListener(
    prompt_key="customer-service",
    args={
        "company_name": "Acme Corp",
        "support_hours": "9 AM - 5 PM EST",
        "language": "English",
    },
)

❓ FAQ

Q: How do I verify the Nacos connection succeeded?

Check the log output for messages like:

INFO - [NacosServiceManager] Loaded Nacos config from env (basic auth): localhost:8848
INFO - [NacosServiceManager] NacosServiceManager initialized (singleton)

Or verify in code:

manager = NacosServiceManager()
assert manager.is_initialized()
Q: Configuration not updating after changes in Nacos?
  1. Check if Nacos configuration Group and DataId are correct
  2. Verify JSON configuration format is valid
  3. Check logs for error messages
  4. Confirm the listener is properly initialized
Q: MCP tools not available?
  1. Confirm MCP server is registered in Nacos MCP Registry
  2. Check if MCP server is running properly
  3. Verify network connectivity
  4. Check MCP client logs
Q: How do I switch between model providers?

Modify the model configuration (e.g., my-model.json) in Nacos:

{
  "modelProvider": "openai",
  "modelName": "gpt-4",
  "apiKey": "sk-xxx"
}

Set "modelProvider" to "anthropic", "ollama", "gemini", or "dashscope" to switch providers.

The configuration will automatically take effect, and the agent will use the new model provider.

Q: What are the naming conventions for agent_name?

agent_name is used to identify configuration groups in Nacos, with the following conventions:

  • Only letters, numbers, ., :, _, - allowed
  • Maximum length 128 characters
  • Spaces automatically replaced with underscores
  • Configuration Group format: ai-agent-{agent_name}
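
These rules can be expressed as a small helper (the function names here are ours, for illustration; the rules themselves are exactly those listed above):

```python
import re

MAX_LEN = 128
# Allowed characters per the conventions above: letters, digits, '.', ':', '_', '-'.
_ALLOWED = re.compile(r"^[A-Za-z0-9.:_-]+$")


def normalize_agent_name(name: str) -> str:
    """Apply the documented rules: spaces -> underscores, charset and length checks."""
    name = name.replace(" ", "_")
    if len(name) > MAX_LEN:
        raise ValueError(f"agent_name exceeds {MAX_LEN} characters")
    if not _ALLOWED.match(name):
        raise ValueError("agent_name may only contain letters, digits, '.', ':', '_', '-'")
    return name


def config_group(agent_name: str) -> str:
    """Build the configuration Group in the documented ai-agent-{agent_name} format."""
    return f"ai-agent-{normalize_agent_name(agent_name)}"


print(config_group("my agent"))  # ai-agent-my_agent
```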
Q: How does prompt variable rendering work?

NacosPromptListener uses {{variable}} syntax:

  • Variables in the template are replaced with values from the args dictionary
  • If a variable is not found in args, the original {{variable}} text is kept
  • Rendering happens on initial load and whenever the Nacos configuration changes
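
The rules above can be sketched as follows (a minimal illustration, not the extension's actual implementation; it assumes variable names consist of word characters):

```python
import re


def render_template(template: str, args: dict) -> str:
    """Replace {{variable}} with args[variable]; keep the placeholder if missing."""
    def sub(match: re.Match) -> str:
        key = match.group(1)
        # A variable absent from args keeps its original {{variable}} text.
        return str(args[key]) if key in args else match.group(0)

    return re.sub(r"\{\{(\w+)\}\}", sub, template)


template = "You are {{role}}, specialized in {{domain}}. Mood: {{mood}}."
print(render_template(template, {"role": "Jarvis", "domain": "programming"}))
# You are Jarvis, specialized in programming. Mood: {{mood}}.
```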

🤝 Community & Support

📄 License

This project is open-sourced under the Apache License 2.0.

🙏 Acknowledgments

Thanks to the following projects and communities for their support:

  • AgentScope - Powerful multi-agent framework
  • Nacos - Dynamic service discovery and configuration management platform
  • MCP Protocol - Model Context Protocol

If this project helps you, please give us a ⭐️ Star!
