
An interactive Python code interpreter designed to collaborate with OpenAI and Azure AI models.

Project description

PyCodeI (Python Code Interpreter)

This project provides a Python code interpreter that interacts with large language models from OpenAI and Azure OpenAI. The interpreter runs Python code in a notebook environment, enabling data analysis, visualization, and prediction. Code is executed interactively, but the notebook is re-executed from the beginning on each turn.

Features

  • Data Analysis and Visualization: Analyze datasets and create appropriate graphs for visualization.
  • Trend Extraction and Future Projections: Extract trends from data and provide future projections.
  • Wide Range of Scientific Topics: Provide information on various scientific topics, including NLP, machine learning, mathematics, physics, chemistry, and biology.
  • Secure Handling of User Files: Store and persist user files under the /mnt/data directory.

Installation

Quick start (PyPI)

pip install pycodei

This installs the published package and exposes the pycodei CLI on your PATH.

Local development workflow

  1. Clone the repository:

    git clone https://github.com/KentaroAOKI/python_code_interpreter.git
    cd python_code_interpreter
    
  2. Install in editable mode (also registers the CLI):

    pip install -e .
    
  3. Create your user configuration file:

    mkdir -p ~/.pycodei
    cat > ~/.pycodei/config.json <<'EOF'
    {
      "DEPLOYMENT_NAME": "gpt-4o-mini",
      "PYCODEI_CLIENT": "azure",
      "AZURE_OPENAI_API_KEY": "<your azure openai api key>",
      "AZURE_OPENAI_ENDPOINT": "https://<your endpoint>.openai.azure.com/",
      "OPENAI_API_VERSION": "2024-10-01-preview",
      "OPENAI_API_KEY": "<your openai api key>"
    }
    EOF
    

    Running pycodei --help will also create this file with placeholder values if it does not exist.
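For illustration, a minimal sketch of how such a config file could be loaded and mapped onto the environment variables the SDKs expect. The function name and exact behavior here are assumptions, not pycodei's actual loader:

```python
import json
import os
from pathlib import Path

def load_pycodei_config(path=None):
    """Read the JSON config and export its string values as environment variables."""
    path = Path(path) if path else Path.home() / ".pycodei" / "config.json"
    config = json.loads(path.read_text())
    for key, value in config.items():
        # Nested objects such as "mcpServers" are not environment variables.
        if isinstance(value, str):
            os.environ.setdefault(key, value)
    return config
```

Using setdefault means values already set in the environment take precedence over the file.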

Usage

To use the Python Code Interpreter, run the following command (replace the prompt text with your task):

pycodei "Retrieve NVIDIA stock prices from Jan-Mar 2023 and predict April onward."

Use pycodei --help to see optional flags such as --deployment-name, which overrides the value in ~/.pycodei/config.json.

Example

You can also start without an inline prompt; the CLI will ask for one interactively:

# Start CLI and respond to the input prompt
pycodei

A notebook file with the execution date and time is created in the results directory. You can view it with Visual Studio Code's Jupyter extension.

  • results/result_20250113-082727.ipynb
    Notebook where data analysis steps were recorded.
  • results/result_20250113-082727.json
    Log of the messages sent to the API.
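Because the timestamp suffix is in YYYYMMDD-HHMMSS form, filenames sort chronologically, so the newest notebook can be found with a plain lexical sort. A small helper sketch (the function name is hypothetical, not part of pycodei):

```python
from pathlib import Path

def latest_result_notebook(results_dir="results"):
    """Return the newest result_*.ipynb, or None if the directory has none.

    The YYYYMMDD-HHMMSS suffix sorts lexically in chronological order.
    """
    notebooks = sorted(Path(results_dir).glob("result_*.ipynb"))
    return notebooks[-1] if notebooks else None
```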

Example 1

"Retrieve NVIDIA stock prices from Yahoo for January to March 2023 and predict prices from April 2023 onwards."
Result: sample_01.ipynb

Example 2

"Determine whether the data in /mnt/data/diagnosis.csv is malignant or benign. To make a decision, use the model learned using the load_breast_cancer data available from scikit-learn."
Result: sample_02.ipynb

Example 3

"2023年の1月から3月のNVIDIA株価をYahooから取得して、2023年4月以降の株価を予測してください。"
(Japanese prompt: "Retrieve NVIDIA stock prices from Yahoo for January to March 2023 and predict prices from April 2023 onwards.")
Result: sample_03.ipynb

Example 4

"/mnt/data/diagnosis.csv のデータが悪性か良性か判断してください。判断は、scikit-learn から取得できる load_breast_cancer データで学習したモデルを使ってください。日本語で説明してください。"
(Japanese prompt: "Determine whether the data in /mnt/data/diagnosis.csv is malignant or benign. Use a model trained on the load_breast_cancer data available from scikit-learn. Explain in Japanese.")
Result: sample_04.ipynb

Configuration

Runtime credentials are read from ~/.pycodei/config.json. Example:

{
  "DEPLOYMENT_NAME": "gpt-4o-mini",
  "PYCODEI_CLIENT": "azure",
  "AZURE_OPENAI_API_KEY": "<your azure openai api key>",
  "AZURE_OPENAI_ENDPOINT": "https://<your endpoint>.openai.azure.com/",
  "OPENAI_API_VERSION": "2024-10-01-preview",
  "OPENAI_API_KEY": "<your openai api key>",
  "mcpServers": {
    "filesystem": {
      "disabled": true,
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/share"
      ],
      "env": {
        "FS_ROOT": "/path/to/share"
      }
    }
  }
}

Set "PYCODEI_CLIENT" to azure (default) or openai to choose which SDK client the interpreter instantiates; you can temporarily override it with the PYCODEI_CLIENT environment variable. All other keys map directly to the environment variables expected by the OpenAI/Azure SDKs. Leaving a value blank may cause authentication failures, so be sure to populate the entries relevant to your deployment.
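The selection precedence described above (environment variable over config file, with azure as the fallback) might look like the following sketch; resolve_client_kind is a hypothetical name, not pycodei's actual function:

```python
import os

def resolve_client_kind(config, env=os.environ):
    """Pick 'azure' or 'openai'; the PYCODEI_CLIENT env var overrides the config."""
    kind = env.get("PYCODEI_CLIENT", config.get("PYCODEI_CLIENT", "azure")).lower()
    if kind not in {"azure", "openai"}:
        raise ValueError(f"unsupported PYCODEI_CLIENT: {kind!r}")
    return kind
```

The resolved value would then determine whether an AzureOpenAI or OpenAI client is instantiated from the SDK.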

Optional: PYCODEI.md

If you keep a PYCODEI.md file, its contents are appended to the system prompt every time pycodei starts. Place the file in one of these locations (checked in order): ~/.pycodei/PYCODEI.md, the directory where the package is installed (alongside python_code_interpreter.py), or your current working directory. Use it for persistent guardrails, safety rules, or project-specific requirements.
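The three-location lookup order can be sketched as follows; the function name and the search_order override are illustrative assumptions, not pycodei's actual internals:

```python
from pathlib import Path

def find_pycodei_md(package_dir, search_order=None):
    """Return the first PYCODEI.md found in the documented lookup order, else None."""
    candidates = search_order or [
        Path.home() / ".pycodei" / "PYCODEI.md",  # 1. user config directory
        Path(package_dir) / "PYCODEI.md",         # 2. package install directory
        Path.cwd() / "PYCODEI.md",                # 3. current working directory
    ]
    for candidate in candidates:
        if candidate.is_file():
            return candidate
    return None
```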

Optional: MCP servers

pycodei can now connect to multiple Model Context Protocol servers using the same schema. Add them under the "mcpServers" key in ~/.pycodei/config.json and set "disabled": false for the entries you want to enable. Example:

"mcpServers": {
  "filesystem": {
    "disabled": false,
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-filesystem",
      "/Users/me/workspace"
    ],
    "env": {
      "FS_ROOT": "/Users/me/workspace"
    }
  },
  "custom_tool": {
    "command": "/usr/local/bin/python",
    "args": [
      "/path/to/server.py"
    ],
    "cwd": "/path/to",
    "transport": "stdio"
  }
}

When pycodei starts, it spins up each enabled server, discovers its MCP tools, and registers them as callable functions. Tool names are derived as server_name__tool_name and become available to the model alongside the built-in run_python tool. SSE (transport: "sse") and WebSocket (transport: "websocket") endpoints are also supported for remote servers; specify a url instead of a command.
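The name-derivation and enablement rules above can be sketched in a few lines; these helper names are hypothetical, chosen only to mirror the documented behavior:

```python
def mcp_tool_name(server_name, tool_name):
    """Qualify an MCP tool with its server name, per the documented scheme."""
    return f"{server_name}__{tool_name}"

def enabled_servers(mcp_servers):
    """Keep only the mcpServers entries not marked "disabled": true."""
    return {
        name: spec
        for name, spec in mcp_servers.items()
        if not spec.get("disabled", False)
    }
```

With the example config above, the filesystem server's read tool would surface to the model as something like filesystem__read_file, while entries with "disabled": true are skipped entirely.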

License

This project is licensed under the MIT License. See the LICENSE file for details.
