# Fencio Proxy Client SDK

Python SDK for integrating with the Fencio HTTP proxy for agent observability and tracing.
## Installation

```bash
pip install -e .
```

Or with optional dependencies:

```bash
# For requests library support
pip install -e ".[requests]"

# For httpx library support
pip install -e ".[httpx]"

# For all HTTP libraries
pip install -e ".[all]"
```
## Quick Start

```python
from fencio_proxy_client import FencioProxyClient
import requests

# Initialize the client
client = FencioProxyClient(
    agent_id="my_threat_agent",
    api_key="your_api_key_here",
    proxy_url="http://localhost:8080"
)

# Start routing requests through the Fencio proxy
client.start()

# All HTTP requests are now traced
response = requests.get("http://httpbin.org/get")

# Stop when done
client.stop()
```
## Context Manager Support

```python
from fencio_proxy_client import FencioProxyClient
import requests

# Use as a context manager
with FencioProxyClient(agent_id="my_agent", api_key="secret") as client:
    # Requests inside this block are traced
    response = requests.get("http://httpbin.org/get")

# The client stops automatically when the block exits
```
## Layer Classification

You can classify requests by layer (e.g., LLM calls vs. tool calls):

```python
from fencio_proxy_client import FencioProxyClient
import requests

client = FencioProxyClient(agent_id="my_agent", api_key="secret")
client.start()

# Mark LLM requests
with client.layer_context('llm', {'model': 'gpt-4', 'provider': 'openai'}):
    response = requests.post(
        'https://api.openai.com/v1/chat/completions',
        json={'model': 'gpt-4', 'messages': [{'role': 'user', 'content': 'Hello'}]}
    )

# Mark tool requests
with client.layer_context('tool', {'tool': 'web_search'}):
    response = requests.get('http://api.example.com/search?q=test')

client.stop()
```
## Session Management

```python
client = FencioProxyClient(
    agent_id="my_agent",
    api_key="secret",
    session_id="custom_session_123"  # Optional: specify your own session ID
)

# Or let the SDK generate a session ID automatically
client = FencioProxyClient(
    agent_id="my_agent",
    api_key="secret",
    auto_session=True  # Default: generates a UUID session ID
)

# Generate a new session ID on the fly
new_session = client.new_session()
print(f"Started new session: {new_session}")
```
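Since auto-generated session IDs are described as UUIDs, you can also pre-generate one yourself with the standard library, for example to correlate traces with your own logs. This is a sketch based on that description, not SDK code:

```python
import uuid

# Generate a session ID in the same spirit as auto_session=True
session_id = str(uuid.uuid4())

# A UUID4 string is 36 characters, e.g. 'xxxxxxxx-xxxx-4xxx-xxxx-xxxxxxxxxxxx'
print(session_id)

# It can then be passed explicitly:
# client = FencioProxyClient(agent_id="my_agent", api_key="secret",
#                            session_id=session_id)
```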
## Supported HTTP Libraries

The SDK automatically patches these HTTP libraries:

- `requests` - the most popular HTTP library
- `httpx` - modern async-capable HTTP client
- `urllib3` - low-level HTTP library

You can choose which libraries to patch:

```python
# Only patch requests
client = FencioProxyClient(
    agent_id="my_agent",
    api_key="secret",
    patch_libraries=['requests']
)

# Patch multiple libraries
client = FencioProxyClient(
    agent_id="my_agent",
    api_key="secret",
    patch_libraries=['requests', 'httpx']
)
```
## Integration with LangChain Agents

```python
import requests
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
from fencio_proxy_client import FencioProxyClient

# Initialize the Fencio client
fencio = FencioProxyClient(
    agent_id="langchain_threat_agent",
    api_key="your_api_key"
)
fencio.start()

# Define tools
tools = [
    Tool(
        name="ThreatIntel",
        func=lambda x: requests.get(f"http://threat-api.com/lookup?q={x}").json(),
        description="Look up threat intelligence data"
    )
]

# All LLM calls and tool calls are now traced through Fencio
llm = OpenAI(temperature=0)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description")
result = agent.run("What is the threat level for IP 192.168.1.1?")

fencio.stop()
```
## Configuration

### Environment Variables

You can also configure the client using environment variables:

```bash
export FENCIO_AGENT_ID="my_agent"
export FENCIO_API_KEY="secret_key"
export FENCIO_PROXY_URL="http://localhost:8080"
```

```python
import os
from fencio_proxy_client import FencioProxyClient

client = FencioProxyClient(
    agent_id=os.getenv('FENCIO_AGENT_ID'),
    api_key=os.getenv('FENCIO_API_KEY'),
    proxy_url=os.getenv('FENCIO_PROXY_URL', 'http://localhost:8080')
)
```
## API Reference

### FencioProxyClient

#### `__init__(agent_id, api_key=None, proxy_url="http://localhost:8080", session_id=None, auto_session=True, patch_libraries=None)`

Initialize the Fencio proxy client.

Parameters:

- `agent_id` (str): Unique identifier for this agent
- `api_key` (str, optional): API key for authentication
- `proxy_url` (str): URL of the Fencio proxy server (default: `"http://localhost:8080"`)
- `session_id` (str, optional): Session identifier (auto-generated if not provided)
- `auto_session` (bool): Auto-generate a session ID if none is provided (default: `True`)
- `patch_libraries` (list, optional): HTTP libraries to patch (default: all available)

#### `start()`

Start routing HTTP requests through the Fencio proxy.

#### `stop()`

Stop routing HTTP requests through the Fencio proxy.

#### `is_active() -> bool`

Check whether the client is currently active.

#### `get_headers(layer=None, layer_metadata=None) -> dict`

Get the Fencio headers to inject into HTTP requests.

Parameters:

- `layer` (str, optional): Layer classification (e.g., `'llm'`, `'tool'`)
- `layer_metadata` (dict, optional): Additional metadata for the layer

Returns:

- Dictionary of HTTP headers

#### `new_session() -> str`

Generate a new session ID and update the client.

Returns:

- The new session ID

#### `layer_context(layer, metadata=None)`

Context manager that sets the layer classification for a block of code.

Parameters:

- `layer` (str): Layer classification (e.g., `'llm'`, `'tool'`)
- `metadata` (dict, optional): Additional metadata for the layer
## How It Works

1. **Library Patching**: When you call `client.start()`, the SDK patches the supported HTTP libraries (`requests`, `httpx`, `urllib3`) so that all HTTP traffic is routed through the Fencio proxy.

2. **Header Injection**: For each HTTP request, the SDK automatically injects these headers:
   - `X-Fencio-Agent-ID`: Your agent identifier
   - `X-Fencio-API-Key`: Your API key (if provided)
   - `X-Fencio-Session-ID`: Session identifier
   - `X-Fencio-Layer`: Layer classification (if using `layer_context`)
   - `X-Fencio-Layer-Metadata`: JSON-encoded layer metadata

3. **Proxy Routing**: All HTTP/HTTPS requests are automatically routed through the Fencio proxy at the configured `proxy_url`.

4. **Trace Storage**: The Fencio proxy captures complete request/response data and stores it with the agent context for later analysis.
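To make the header-injection step concrete, here is a rough sketch of how the headers listed above could be assembled. The header names come from the list; the helper function itself is illustrative, not the SDK's actual code:

```python
import json

def build_fencio_headers(agent_id, api_key=None, session_id=None,
                         layer=None, layer_metadata=None):
    """Assemble the X-Fencio-* headers described above (illustrative)."""
    headers = {'X-Fencio-Agent-ID': agent_id}
    if api_key:
        headers['X-Fencio-API-Key'] = api_key
    if session_id:
        headers['X-Fencio-Session-ID'] = session_id
    if layer:
        headers['X-Fencio-Layer'] = layer
        if layer_metadata:
            # Metadata rides along as a JSON-encoded string
            headers['X-Fencio-Layer-Metadata'] = json.dumps(layer_metadata)
    return headers

print(build_fencio_headers('my_agent', session_id='session-123',
                           layer='llm', layer_metadata={'model': 'gpt-4'}))
```

Note that optional headers are simply omitted when their values are absent, which matches the "(if provided)" / "(if using `layer_context`)" qualifiers in the list.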
## License

MIT License
## Project details

## Download files

### Source Distribution

### Built Distribution
## File details

Details for the file `fencio_proxy_client-0.1.0.tar.gz`.

### File metadata

- Download URL: fencio_proxy_client-0.1.0.tar.gz
- Upload date:
- Size: 11.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `0f83405c23ef7a30d61631409cebf4fd5c56d4b7369c8d67afec43602caacdad` |
| MD5 | `a436248da34a51df10f0b54d19cbc93e` |
| BLAKE2b-256 | `6f99476d1512088b0e8b205b2544e0f5eb081a7f0615ad10eeeaf38270a11297` |
## Provenance

The following attestation bundles were made for `fencio_proxy_client-0.1.0.tar.gz`:

Publisher: `publish-proxy-client.yml` on fencio-dev/proxy

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: fencio_proxy_client-0.1.0.tar.gz
- Subject digest: 0f83405c23ef7a30d61631409cebf4fd5c56d4b7369c8d67afec43602caacdad
- Sigstore transparency entry: 1078582688
- Sigstore integration time:
- Permalink: fencio-dev/proxy@3cfa4f89dfc41caf8a54ffbc3e12ebfdfd66ee6a
- Branch / Tag: refs/tags/proxy-client/v0.1.0
- Owner: https://github.com/fencio-dev
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-proxy-client.yml@3cfa4f89dfc41caf8a54ffbc3e12ebfdfd66ee6a
- Trigger Event: push
## File details

Details for the file `fencio_proxy_client-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: fencio_proxy_client-0.1.0-py3-none-any.whl
- Upload date:
- Size: 10.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `a987ebf3055ffd0b53ded8cce5cfd13e7894a93abd8735cd0bc98d09e1c5e830` |
| MD5 | `7bbe8b3c6eca6f39da71960eefb10d20` |
| BLAKE2b-256 | `05876488b373279efd719a808ae335eff8f91bc95a2dd48159fc6123c0c8e88e` |
## Provenance

The following attestation bundles were made for `fencio_proxy_client-0.1.0-py3-none-any.whl`:

Publisher: `publish-proxy-client.yml` on fencio-dev/proxy

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: fencio_proxy_client-0.1.0-py3-none-any.whl
- Subject digest: a987ebf3055ffd0b53ded8cce5cfd13e7894a93abd8735cd0bc98d09e1c5e830
- Sigstore transparency entry: 1078582702
- Sigstore integration time:
- Permalink: fencio-dev/proxy@3cfa4f89dfc41caf8a54ffbc3e12ebfdfd66ee6a
- Branch / Tag: refs/tags/proxy-client/v0.1.0
- Owner: https://github.com/fencio-dev
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-proxy-client.yml@3cfa4f89dfc41caf8a54ffbc3e12ebfdfd66ee6a
- Trigger Event: push