Lightweight and portable LLM sandbox runtime (code interpreter) Python library
LLM Sandbox
The easiest way to run large language model (LLM) generated code (code interpreter) in a safe and isolated environment.
LLM Sandbox is a lightweight and portable sandbox environment designed to run large language model (LLM) generated code in a safe and isolated manner using Docker containers. This project aims to provide an easy-to-use interface for setting up, managing, and executing code in a controlled Docker environment, simplifying the process of running code generated by LLMs.
Features
- Easy Setup: Quickly create sandbox environments with minimal configuration.
- Isolation: Run your code in isolated Docker containers to prevent interference with your host system.
- Flexibility: Support for multiple programming languages.
- Portability: Use predefined Docker images or custom Dockerfiles.
- Scalability: Support Kubernetes and remote Docker host.
Installation
Using Poetry
- Ensure you have Poetry installed.
- Add the package to your project:
poetry add llm-sandbox
Using pip
- Ensure you have pip installed.
- Install the package:
pip install llm-sandbox
Usage
Session Lifecycle
The SandboxSession class manages the lifecycle of the sandbox environment, including the creation and destruction of Docker containers. Here's a typical lifecycle:
- Initialization: Create a SandboxSession object with the desired configuration.
- Open Session: Call the open() method to build/pull the Docker image and start the Docker container.
- Run Code: Use the run() method to execute code inside the sandbox. Currently, it supports Python, Java, JavaScript, C++, Go, and Ruby. See the examples for more details.
- Close Session: Call the close() method to stop and remove the Docker container. If the keep_template flag is set to True, the Docker image will not be removed, and the last container state will be committed to the image.
Example
Here's a simple example to demonstrate how to use LLM Sandbox:
from llm_sandbox import SandboxSession
# Create a new sandbox session
with SandboxSession(image="python:3.9.19-bullseye", keep_template=True, lang="python") as session:
result = session.run("print('Hello, World!')")
print(result)
# With custom Dockerfile
with SandboxSession(dockerfile="Dockerfile", keep_template=True, lang="python") as session:
result = session.run("print('Hello, World!')")
print(result)
# Or default image
with SandboxSession(lang="python", keep_template=True) as session:
result = session.run("print('Hello, World!')")
print(result)
LLM Sandbox also supports copying files between the host and the sandbox:
from llm_sandbox import SandboxSession
with SandboxSession(lang="python", keep_template=True) as session:
# Copy a file from the host to the sandbox
session.copy_to_runtime("test.py", "/sandbox/test.py")
# Run the copied Python code in the sandbox
result = session.run("python /sandbox/test.py")
print(result)
# Copy a file from the sandbox to the host
session.copy_from_runtime("/sandbox/output.txt", "output.txt")
For usage with other languages, please refer to the examples.
You can also use a remote Docker host, as shown below:
import docker
from llm_sandbox import SandboxSession
tls_config = docker.tls.TLSConfig(
client_cert=("path/to/cert.pem", "path/to/key.pem"),
ca_cert="path/to/ca.pem",
verify=True
)
docker_client = docker.DockerClient(base_url="tcp://<your_host>:<port>", tls=tls_config)
with SandboxSession(
client=docker_client,
image="python:3.9.19-bullseye",
keep_template=True,
lang="python",
) as session:
result = session.run("print('Hello, World!')")
print(result)
For Kubernetes usage, please refer to the examples. Essentially, you just need to set the use_kubernetes flag to True and provide the Kubernetes client, or leave it as the default for the local context.
API Reference
SandboxSession
Initialization
SandboxSession(
image: Optional[str] = None,
dockerfile: Optional[str] = None,
lang: str = SupportedLanguage.PYTHON,
keep_template: bool = False,
verbose: bool = True
)
- image: Docker image to use.
- dockerfile: Path to the Dockerfile, if an image is not provided.
- lang: Language of the code (default: SupportedLanguage.PYTHON).
- keep_template: If True, the image and container will not be removed after the session ends.
- verbose: If True, print log messages.
Methods
- open(): Start the Docker container.
- close(): Stop and remove the Docker container.
- run(code: str, libraries: Optional[List] = None): Execute code inside the sandbox.
- copy_from_runtime(src: str, dest: str): Copy a file from the sandbox to the host.
- copy_to_runtime(src: str, dest: str): Copy a file from the host to the sandbox.
- execute_command(command: str): Execute a command inside the sandbox.
Contributing
We welcome contributions to improve LLM Sandbox! As the maintainer is primarily a Python developer, support for other languages is limited. If you are interested in adding better support for other languages, please feel free to submit a pull request.
Here is a list of things you can do to contribute:
- Add Java maven support.
- Add support for JavaScript.
- Add support for C++.
- Add support for Go.
- Add support for Ruby.
- Add remote Docker host support.
- Add remote Kubernetes cluster support.
- Commit the last container state to the image before closing kubernetes session.
- Release version 1.0.0.
License
This project is licensed under the MIT License. See the LICENSE file for details.