
Promptflow Tool Semantic Kernel

A Python package that integrates Semantic Kernel with Azure Prompt Flow, enabling efficient LLM application development.


Why?

This tool bridges the powerful execution flow of Promptflow with the advanced ReAct capabilities of Semantic Kernel, offering several advantages:

  • Easily pre-process or post-process data from your main assistant with minimal configuration.
  • Leverage Semantic Kernel's planning and reasoning capabilities within your Prompt Flow applications by just providing configuration.
  • Connect to a variety of LLM providers beyond OpenAI, including Anthropic Claude, Amazon Bedrock, Llama, and more through Semantic Kernel's connectors.
  • Access Semantic Kernel's growing plugin ecosystem to extend functionality without writing custom code.
  • Use Promptflow's UI and batch evaluation with your Semantic Kernel assistant.

The integration creates a best-of-both-worlds solution, combining Promptflow's orchestration capabilities with Semantic Kernel's flexibility and plugin architecture.

Installation

Install the package from PyPI:

pip install promptflow-tool-semantic-kernel

You can find the package on PyPI.

Usage

In the VSCode Promptflow extension

Once installed, the Semantic Kernel tool will be available in your Promptflow tools collection:

[Screenshot: the new tool in the VSCode sidebar]

  1. Create a new promptflow in VSCode
  2. Add a custom LLM tool node
  3. Select "Semantic Kernel LLM Tool" from the tool list
  4. Configure the following parameters:
    • Connection (Azure OpenAI or OpenAI)
    • Deployment name (model name for OpenAI or deployment name for Azure)
    • Chat history (optional)
    • Plugins (optional)
    • Customize your prompt as needed
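
The chat history Promptflow passes to a chat tool is conventionally a list of per-turn dicts keyed by the flow's chat input and output names. The snippet below is an illustration of that shape, assuming the input is named `question` and the output `answer` (as in the flow example later in this page), not the package's actual handling:

```python
# One dict per prior turn, keyed by the flow's chat input/output names.
# "question" and "answer" are assumptions matching the example flow below.
chat_history = [
    {
        "inputs": {"question": "What can the lights plugin do?"},
        "outputs": {"answer": "It can report and toggle light states."},
    },
]

# Flatten into (role, text) pairs, as a tool might before calling the model.
messages = []
for turn in chat_history:
    messages.append(("user", turn["inputs"]["question"]))
    messages.append(("assistant", turn["outputs"]["answer"]))
```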

[Screenshot: the Semantic Kernel chat node in a flow]

Running the Demo

The package includes a simple demo script:

# Set up environment variables
export AZURE_OPENAI_API_KEY=your_api_key
export AZURE_OPENAI_ENDPOINT=your_endpoint
export AZURE_OPENAI_DEPLOYMENT_NAME=your_deployment_name

# Run the demo
python -m scripts.main

Using different connections

Google Gemini

Add a CustomConnection via Promptflow in VSCode as follows; the key setting is api_type: "google":

$schema: https://azuremlschemas.azureedge.net/promptflow/latest/CustomConnection.schema.json
name: "google_gemini"
type: custom
configs:
  api_type: "google"
  model_id: "gemini-2.0-flash"
secrets:
  # Use '<no-change>' to keep the original value, or '<user-input>' to update it when the application runs.
  api_key: "<user-input>"
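
Presumably, a tool like this branches on the connection's `api_type` to pick the matching Semantic Kernel connector. The dispatch below is a hypothetical sketch of that idea (the function name and returned dict are illustrative, not the package's actual code; the connector class names refer to Semantic Kernel's Azure OpenAI and Google AI chat services):

```python
def pick_chat_service(configs: dict, secrets: dict) -> dict:
    """Hypothetical dispatch from a CustomConnection's configs/secrets
    to a Semantic Kernel chat-completion service (illustrative only)."""
    api_type = configs.get("api_type", "azure")
    if api_type == "google":
        # Would map to Semantic Kernel's Google AI connector.
        return {
            "service": "GoogleAIChatCompletion",
            "model_id": configs.get("model_id", "gemini-2.0-flash"),
            "api_key": secrets["api_key"],
        }
    # Default: Azure OpenAI settings.
    return {
        "service": "AzureChatCompletion",
        "deployment_name": configs.get("deployment_name"),
        "api_key": secrets["api_key"],
    }

service = pick_chat_service(
    {"api_type": "google", "model_id": "gemini-2.0-flash"},
    {"api_key": "dummy"},
)
```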

Adding Custom Plugins

Semantic Kernel allows you to easily extend functionality through plugins. Learn more about creating a native plugin.

Here's how to use plugins with this tool:

Built-in Plugins

The tool comes with a built-in LightsPlugin for demonstration:

# Default plugin configuration
plugins = [
     {
          "name": "lights",
          "class": "LightsPlugin",
          "module": "promptflow_tool_semantic_kernel.tools.lights_plugin"
     }
]
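
For reference, a plugin like the built-in LightsPlugin can be a plain Python class whose methods are exposed through Semantic Kernel's `kernel_function` decorator. The sketch below is illustrative, not the package's actual implementation; the import fallback only exists so the sketch runs without semantic_kernel installed:

```python
try:
    from semantic_kernel.functions import kernel_function
except ImportError:
    # Fallback no-op decorator so this sketch runs without semantic_kernel.
    def kernel_function(**_kwargs):
        def wrap(fn):
            return fn
        return wrap


class LightsPlugin:
    """Toy plugin tracking the on/off state of named lights."""

    def __init__(self):
        self._lights = {"desk": False, "ceiling": False}

    @kernel_function(name="get_state", description="Get a light's on/off state.")
    def get_state(self, name: str) -> str:
        return "on" if self._lights.get(name) else "off"

    @kernel_function(name="toggle", description="Toggle a light; return new state.")
    def toggle(self, name: str) -> str:
        self._lights[name] = not self._lights.get(name, False)
        return self.get_state(name)
```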

Configuring with flow.dag.yaml

You can also configure the tool using a flow.dag.yaml file. This file defines the flow and its components, including the semantic_kernel_chat tool and its plugins. Here is an example configuration:

# filepath: /workspaces/promptflow-tool-semantic-kernel/tests/system/flow.dag.yaml
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/Flow.schema.json
environment:
  python_requirements_txt: requirements.txt
environment_variables:
  PROMPTFLOW_SERVING_ENGINE: fastapi
  PF_DISABLE_TRACING: "false"
inputs:
  chat_history:
    type: list
    is_chat_history: true
    default: []
  question:
    type: string
    is_chat_input: true
outputs:
  answer:
    type: string
    reference: ${semantic_kernel_chat.output}
    is_chat_output: true
nodes:
- name: semantic_kernel_chat
  type: custom_llm
  source:
    type: package_with_prompt
    tool: promptflow_tool_semantic_kernel.tools.semantic_kernel_tool.semantic_kernel_chat
    path: semantic_kernel_chat.jinja2
  inputs:
    connection: open_ai_connection
    deployment_name: gpt-4
    chat_history: ${inputs.chat_history}
    question: ${inputs.question}
    plugins: |
      [
        {
          "name": "lights",
          "class": "LightsPlugin",
          "module": "promptflow_tool_semantic_kernel.tools.lights_plugin"
        }
      ]

This configuration lets you use plugins within your flow, and you can list multiple plugins to extend the semantic_kernel_chat tool. Each plugin is specified by its name, class, and module, making it easy to integrate and customize as needed.
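
A name/class/module spec like the one above can be resolved with `importlib`. The loader below is a hedged sketch of that idea, not the package's actual code; it is demonstrated with a stdlib class so it runs anywhere:

```python
import importlib
import json


def load_plugins(plugins_json: str) -> dict:
    """Instantiate plugin classes from a JSON spec of the form used in
    flow.dag.yaml (a sketch of how such a spec could be resolved)."""
    plugins = {}
    for spec in json.loads(plugins_json):
        module = importlib.import_module(spec["module"])  # e.g. the plugin's module path
        cls = getattr(module, spec["class"])              # e.g. "LightsPlugin"
        plugins[spec["name"]] = cls()                     # one instance per spec entry
    return plugins


# Demonstrated with a stdlib class so the sketch is runnable without the package:
demo_spec = '[{"name": "counter", "class": "Counter", "module": "collections"}]'
plugins = load_plugins(demo_spec)
```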

Development

Setup

  1. Clone the repository
    git clone git@github.com:FabianSchurig/promptflow-tool-semantic-kernel.git
    cd promptflow-tool-semantic-kernel
    cp .devcontainer/devcontainer.env.example .devcontainer/devcontainer.env
    
  2. Start the devcontainer with VSCode
  3. Install development dependencies (should automatically run):
    poetry install
    
  4. Activate the environment
    eval $(poetry env activate)
    which python
    uvicorn tests.system.api:app --workers 1 --port 5000
    

Testing

Run the tests with pytest:

poetry run pytest
poetry run pytest --cov-report xml:coverage.xml --cov-report term --cov=promptflow_tool_semantic_kernel --cov-config=.coveragerc tests/

License

This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0) - see the LICENSE file for details.

