
Protecting GenAI from Prompt Injection


Panoptica GenAI Protection SDK

A simple Python client SDK for integrating with Panoptica GenAI Protection.


GenAI Protection is part of Panoptica, a cloud-native application protection platform (CNAPP), and provides protection for LLM-backed systems. Specifically, the GenAI Protection SDK inspects both prompts and LLM responses, flagging those it identifies, with a high degree of certainty, as likely to contain malicious content.

The Python SDK lets you programmatically integrate your system with our LLM protection software, so you can verify the safety of a user-requested prompt before actually processing it. Based on this evaluation, your application can then decide on the appropriate next steps according to your policy.

Installation

pip install panoptica_genai_protection

Usage Example

Working assumptions:

  • You have generated a key-pair for GenAI Protection in the Panoptica settings screen
    • The access key is set in the GENAI_PROTECTION_ACCESS_KEY environment variable
    • The secret key is set in the GENAI_PROTECTION_SECRET_KEY environment variable (see the setup sketch after this list)
  • We denote the call that generates the LLM response as get_llm_response()
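
If you prefer to set the keys from code rather than from the shell, a minimal setup sketch (assuming the client reads the two environment variables above at initialization; the placeholder values are hypothetical) could look like this:

import os

from panoptica_genai_protection.client import GenAIProtectionClient

# Hypothetical placeholder values -- use the key-pair generated in the Panoptica settings screen
os.environ["GENAI_PROTECTION_ACCESS_KEY"] = "<your-access-key>"
os.environ["GENAI_PROTECTION_SECRET_KEY"] = "<your-secret-key>"

# The client picks up the credentials from the environment
genai_protection_client = GenAIProtectionClient()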

GenAIProtectionClient provides the check_llm_prompt method to determine the safety level of a given prompt.

Sample Snippet

from panoptica_genai_protection.client import GenAIProtectionClient
from panoptica_genai_protection.gen.models import Result as InspectionResult

# ... Other code in your module ...

# initialize the client
genai_protection_client = GenAIProtectionClient()

# Send the prompt for inspection BEFORE sending it to the LLM
inspection_result = genai_protection_client.check_llm_prompt(
  chat_request.prompt,
  api_name="chat_service",  # Name of the service running the LLM
  api_endpoint_name="/chat",  # Name of the endpoint serving the LLM interaction
  sequence_id=chat_id,  # UUID of the chat, if you don't have one, provide `None`
  actor="John Doe",  # Name of the "actor" interacting with the LLM service.
  actor_type="user",  # Actor type, one of {"user", "ip", "bot"}
)

if inspection_result.result == InspectionResult.safe:
  # Prompt is safe, generate an LLM response
  llm_response = get_llm_response(
    chat_request.prompt
  )

  # Call GenAI protection on LLM response (completion)
  inspection_result = genai_protection_client.check_llm_response(
    prompt=chat_request.prompt,
    response=llm_response,
    api_name="chat_service",
    api_endpoint_name="/chat",
    actor="John Doe",
    actor_type="user",
    request_id=inspection_result.reqId,
    sequence_id=chat_id,
  )
  if inspection_result.result != InspectionResult.safe:
    # LLM answer is flagged as unsafe, return a predefined error message to the user
    answer_response = "Something went wrong."
  else:
    # Both the prompt and the LLM answer are safe, return the answer to the user
    answer_response = llm_response
else:
  # Prompt is flagged as unsafe, return a predefined error message to the user
  answer_response = "Something went wrong."

Async use:

You can use the client in an async context in two ways:

async def my_async_call_to_gen_ai_protection(prompt: str):
    client = GenAIProtectionClient(as_async=True)
    return await client.check_llm_prompt_async(
        prompt=prompt,
        api_name="test",
        api_endpoint_name="/test",
        actor="John Doe",
        actor_type="user"
    )

or

async def my_other_async_call_to_gen_ai_protection(prompt: str):
    async with GenAIProtectionClient() as client:
        return await client.check_llm_prompt_async(
            prompt=prompt,
            api_name="test",
            api_endpoint_name="/test",
            actor="John Doe",
            actor_type="user"
        )
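
Either coroutine can then be driven from synchronous code in the usual way, for example with an arbitrary prompt:

import asyncio

inspection_result = asyncio.run(my_async_call_to_gen_ai_protection("Hello!"))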
