MFCS (Model Function Calling Standard)
A Python library for handling function calling in Large Language Models (LLMs).
Features
- Generate function calling prompt templates
- Parse function calls from LLM streaming output
- Validate function schemas
- Async streaming support
- API result management
- Multiple function call handling
Installation
pip install mfcs
Configuration
- Copy .env.example to .env:
cp .env.example .env
- Edit .env and set your environment variables:
# OpenAI API Configuration
OPENAI_API_KEY=your-api-key-here
OPENAI_API_BASE=your-api-base-url-here
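In practice these variables are usually loaded with the python-dotenv package (`load_dotenv()`). As a rough illustration of what that loading step does, here is a minimal stdlib-only sketch; the function name `load_env` is our own, not part of mfcs:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader (sketch; python-dotenv's load_dotenv is the usual choice)."""
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            # Skip blank lines, comments, and malformed entries
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Don't override variables already set in the real environment
            os.environ.setdefault(key.strip(), value.strip())
```

After calling `load_env()`, values such as `OPENAI_API_KEY` are available via `os.getenv`.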
Example Installation
To run the example code, you need to install additional dependencies. The examples live in the examples directory, and each example lists its own dependency requirements:
cd examples
pip install -r requirements.txt
Usage
1. Generate Function Calling Prompt Templates
from mfcs.function_calling.function_prompt import FunctionPromptGenerator
# Define your function schemas
functions = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The unit of temperature to use",
                    "default": "celsius"
                }
            },
            "required": ["location"]
        }
    }
]
# Generate prompt template
template = FunctionPromptGenerator.generate_function_prompt(functions)
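The generated template is typically embedded in the system prompt sent to the model. A hypothetical wiring, where `template` stands in for the string returned by `generate_function_prompt` (the exact contents of that string are library-defined):

```python
# Placeholder for FunctionPromptGenerator.generate_function_prompt(functions)
template = "<function schemas rendered as prompt text>"

# Prepend the template to the system message so the model knows how to
# emit <mfcs_call> blocks in its replies.
messages = [
    {"role": "system", "content": "You are a helpful assistant.\n\n" + template},
    {"role": "user", "content": "What's the weather in San Francisco?"},
]
```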
2. Parse Function Calls from Output
from mfcs.function_calling.response_parser import ResponseParser
# Example function call
output = """
I need to check the weather.
<mfcs_call>
<instructions>Getting weather information for New York</instructions>
<call_id>weather_1</call_id>
<name>get_weather</name>
<parameters>
{
    "location": "New York, NY",
    "unit": "fahrenheit"
}
</parameters>
</mfcs_call>
"""
# Parse the function call
parser = ResponseParser()
content, tool_calls = parser.parse_output(output)
print(f"Content: {content}")
print(f"Function calls: {tool_calls}")
3. Async Streaming Processing and Function Calling
from mfcs.function_calling.response_parser import ResponseParser
from mfcs.function_calling.api_result_manager import ApiResultManager
async def process_stream():
    parser = ResponseParser()
    api_results = ApiResultManager()
    # `stream` is your LLM client's async chunk iterator and
    # `process_function_call` is your own handler (not shown here).
    async for chunk in stream:
        content, tool_calls = parser.parse_stream_output(chunk)
        if content:
            print(content, end="", flush=True)
        if tool_calls:
            for tool_call in tool_calls:
                # Process each function call and store its result
                result = await process_function_call(tool_call)
                api_results.add_api_result(tool_call['call_id'], tool_call['name'], result)
    # Get all processing results
    return api_results.get_api_results()
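The `process_function_call` helper above is left to the application. One plausible shape is a small dispatcher that maps call names to local async handlers; everything here (handler names, the `"arguments"` key, the simulated weather result) is illustrative, not part of the mfcs API:

```python
import asyncio

async def get_weather(location, unit="celsius"):
    # Simulated API call; replace with a real weather client in production.
    return {"location": location, "temperature": 22, "unit": unit}

# Map function names from parsed calls to local handlers
HANDLERS = {"get_weather": get_weather}

async def process_function_call(tool_call):
    handler = HANDLERS[tool_call["name"]]
    return await handler(**tool_call["arguments"])

result = asyncio.run(process_function_call(
    {"call_id": "weather_1", "name": "get_weather",
     "arguments": {"location": "New York, NY", "unit": "fahrenheit"}}
))
```

Keeping the dispatch table explicit makes it easy to validate names against the registered schemas before executing anything.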
Examples
Check out the examples directory for more detailed examples:
- function_calling_examples.py: Basic function calling examples
  - Function prompt generation
  - Function call parsing
  - API result management
- async_function_calling_examples.py: Async streaming examples
  - Async streaming best practices
  - Concurrent function call handling
  - Async error handling and timeout control
- mcp_client_example.py: MCP client integration examples
  - Basic MCP client setup
  - Function registration
  - Tool calling implementation
- async_mcp_client_example.py: Async MCP client examples
  - Async MCP client configuration
  - Async tool calling implementation
  - Concurrent task processing
Run the examples to see the library in action:
# Run basic examples
python examples/function_calling_examples.py
python examples/mcp_client_example.py
# Run async examples
python examples/async_function_calling_examples.py
python examples/async_mcp_client_example.py
Notes
- The library requires Python 3.8+ for async features
- Make sure to handle API keys and sensitive information securely
- For production use, replace simulated API calls with actual implementations
- Follow the tool calling rules in the prompt template
- Use unique call_ids for each function call
- Provide clear instructions for each function call
- Handle errors and resource cleanup in async streaming processing
- Use ApiResultManager to manage results from multiple function calls
- Handle exceptions and timeouts properly in async contexts
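On the unique call_id point, two simple minting strategies are shown below; the helper names are our own, not part of mfcs:

```python
import itertools
import uuid

_counter = itertools.count(1)

def next_call_id(name):
    # Sequential ids: compact and readable in logs, e.g. "get_weather_1".
    return f"{name}_{next(_counter)}"

def random_call_id(name):
    # Random ids: safe across concurrent streams without shared counter state.
    return f"{name}_{uuid.uuid4().hex[:8]}"
```

Sequential ids are easiest to correlate with transcripts; random ids avoid coordination when several streams run concurrently.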
System Requirements
- Python 3.8 or higher
License
MIT License