# code-executor-py

A powerful and flexible Python module for secure code execution in isolated environments: run Python code safely in a separate local venv or in a remote sandbox container.

## Features
- **VenvExecutor**: Execute code in isolated virtual environments
  - Automatic dependency detection and installation
  - Smart package-name resolution
  - Handles import errors intelligently
- **RemoteExecutor**: Run code in remote sandbox containers
  - Client-server architecture
  - Secure data serialization
  - Network-isolated execution
- **Smart package management**
  - Automatically resolves import statements to package names
  - Maps common aliases to the correct package names (e.g., `sklearn` → `scikit-learn`)
  - Dynamically installs missing dependencies
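Alias-based package-name resolution of this kind can be sketched with a small lookup table. The `IMPORT_TO_PACKAGE` map and `resolve_package_name` helper below are illustrative, not the library's actual internals:

```python
# Illustrative map from top-level import names to pip package names.
# The library's real table is internal and may cover more aliases.
IMPORT_TO_PACKAGE = {
    "sklearn": "scikit-learn",
    "cv2": "opencv-python",
    "PIL": "Pillow",
    "bs4": "beautifulsoup4",
    "yaml": "PyYAML",
}

def resolve_package_name(import_name: str) -> str:
    """Map a top-level import name to its pip-installable package name."""
    # Names without a known alias are assumed to match their package name.
    return IMPORT_TO_PACKAGE.get(import_name, import_name)

print(resolve_package_name("sklearn"))   # scikit-learn
print(resolve_package_name("requests"))  # requests (unmapped names pass through)
```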
## Installation

```bash
pip install code-executor-py
```
## Quick Start

### Local Execution with VenvExecutor

```python
import pandas as pd

from code_executor_py import VenvExecutor

# Define your function
func_code = """
from sklearn.preprocessing import StandardScaler
import pandas as pd

def process_data(data_df):
    scaler = StandardScaler()
    return pd.DataFrame(scaler.fit_transform(data_df), columns=data_df.columns)
"""

# Create executor and compile the function
executor = VenvExecutor()
process_data = executor.create_executable(func_code)

# Execute the function with data
test_data = pd.DataFrame({'A': [1, 2, 3], 'B': [4, 5, 6]})
result = process_data(data_df=test_data)
print(result)
```
### Remote Execution (Server)

```python
from code_executor_py import RemoteExecutorServer

# Start a server on port 8099
server = RemoteExecutorServer(host="0.0.0.0", port=8099)
server.run()
```
### Remote Execution (Client)

```python
from code_executor_py import RemoteExecutor

# Connect to the remote server
executor = RemoteExecutor("http://localhost:8099")

# Define your function
func_code = """
def add_numbers(a: int, b: int) -> int:
    return a + b
"""

# Create an executable function
add_numbers = executor.create_executable(func_code)

# Execute remotely
result = add_numbers(5, 3)
print(result)  # Output: 8
```
## Advanced Usage

### Using an LLM to Auto-Install Dependencies

```python
from code_executor_py import VenvExecutor
from langchain_openai import ChatOpenAI

# Create an executor with an LLM for smart package resolution
executor = VenvExecutor(
    llm=ChatOpenAI(temperature=0),
    debug_mode=True,
)

# The executor can now ask the LLM for the correct package name
# when it encounters an unknown import.
```
### Custom Base Packages

```python
from code_executor_py import VenvExecutor

# Create an executor with a custom venv location and base packages
executor = VenvExecutor(
    venv_path="./custom_venv",
    base_packages=["numpy", "pandas", "matplotlib", "scipy"],
)
```
## How It Works

1. **Code analysis**: Your function code is parsed to extract its imports.
2. **Environment preparation**: Missing dependencies are installed automatically in the isolated environment.
3. **Secure execution**: The code runs in a separate process with proper error handling.
4. **Result serialization**: Results are serialized and returned to your main program.
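The code-analysis step can be approximated with Python's standard `ast` module. The sketch below shows one way imports might be extracted; it is an illustration of the technique, not the library's actual implementation:

```python
import ast

def extract_imports(source: str) -> set:
    """Return the top-level module names imported by the given source code."""
    modules = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            # "import pandas as pd" -> "pandas"; "import os.path" -> "os"
            for alias in node.names:
                modules.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom) and node.module:
            # node.module is None for relative imports like "from . import x"
            modules.add(node.module.split(".")[0])
    return modules

code = "from sklearn.preprocessing import StandardScaler\nimport pandas as pd"
print(sorted(extract_imports(code)))  # ['pandas', 'sklearn']
```

Each extracted name can then be fed through the package-name resolver before installation.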
## Security Benefits

- **Isolation**: Code executes in separate environments, protecting your main application
- **Dependency management**: Avoids conflicts with your application's dependencies
- **Resource control**: Limit the resources available to executed code (especially with remote execution)
- **Network isolation**: Remote execution can keep untrusted code off your local network entirely
## Use Cases

- **AI/ML pipelines**: Safely execute generated or user-provided code
- **Data processing**: Run data-transformation scripts in isolation
- **Teaching/education**: Create safe execution environments for student code
- **Microservices**: Offload resource-intensive operations to remote executors
## License

MIT

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request or open an Issue.