
Python-Use: Python-Powered AI


Python use

Python use (aipython) is a Python command-line interpreter integrated with an LLM.

What

Python use gives the LLM an entire Python execution environment. Imagine an LLM sitting in front of a computer, typing commands into the Python command-line interpreter, pressing Enter to execute them, observing the results, and then typing and executing more code.

Unlike typical agent frameworks, Python use does not define a tools interface. The LLM can freely use every feature the Python runtime environment provides.
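The core idea can be sketched in a few lines. This is a hypothetical illustration, not aipython's actual implementation: `fake_llm` stands in for a real model call, the host executes whatever Python source comes back in a persistent namespace, and the captured output becomes the observation the LLM would see next.

```python
import contextlib
import io

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call: returns Python source for the task."""
    return "rows = [1, 2, 3]\nprint(sum(rows))"

namespace = {}                # persists across turns, like a REPL session

code = fake_llm("Sum the numbers 1..3")
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(code, namespace)     # no tool schema: the whole runtime is available

observation = buf.getvalue().strip()   # the text fed back to the LLM
```

Because execution goes through the plain interpreter rather than a fixed tool list, anything importable in Python is available to the model.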

Why

If you are a data engineer, you are likely familiar with the following scenarios:

  • Handling various data file formats: CSV/Excel, JSON, HTML, SQLite, Parquet, etc.
  • Performing operations like data cleaning, transformation, computation, aggregation, sorting, grouping, filtering, analysis, and visualization.

This process often requires:

  • Starting Python, importing pandas as pd, and typing a bunch of commands to process data.
  • Generating a bunch of intermediate temporary files.
  • Describing your needs to ChatGPT/Claude, copying the generated data processing code, and running it manually.

So, why not start the Python command-line interpreter, directly describe your data processing needs, and let it be done automatically? The benefits are:

  • No need to type long sequences of Python commands by hand.
  • No need to describe your needs to GPT, then copy the generated program and run it manually.

This is the problem Python use aims to solve!

How

Python use (aipython) is a Python command-line interpreter integrated with an LLM. You can:

  • Enter and execute Python commands as usual.
  • Describe your needs in natural language, and aipython will automatically generate Python commands and execute them.

Moreover, the two modes share the same data. For example, after aipython processes a natural-language command, you can inspect the resulting data with standard Python commands.
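The shared-data behavior can be illustrated with a small sketch (hypothetical, with the LLM's reply hard-coded as a string): code "generated by the LLM" runs in the same namespace the user's own commands use, so the results are ordinary Python variables afterwards.

```python
shared = {}

# Pretend this string came back from the LLM for the natural-language
# request "average these prices"; it executes in the shared namespace.
llm_code = (
    "import statistics\n"
    "prices = [9.5, 10.0, 10.5]\n"
    "avg = statistics.mean(prices)\n"
)
exec(llm_code, shared)

# Afterwards the user inspects the result with ordinary Python commands;
# there is no copy/paste step between the two modes.
print(shared["avg"])
```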

Interfaces

ai Object

  • __call__(instruction): Run the automatic processing loop until the LLM no longer returns code messages
  • save(path): Save the interaction transcript to an SVG or HTML file
  • llm property: the LLM object
  • runner property: the Runner object

LLM Object

  • history property: Message history of the interaction between the user and the LLM

Runner Object

  • globals: Global variables of the Python environment that executes the code returned by the LLM
  • locals: Local variables of that environment

runtime Object

An object exposed for LLM-generated code to call, providing the following interface:

  • install_packages(packages): Request installation of third-party packages
  • getenv(name, desc=None): Get an environment variable
  • display(path=None, url=None): Display an image in the terminal
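As a rough sketch of what such a helper object could look like (the method names follow the list above, but the bodies are simplified assumptions, not the package's actual code):

```python
import importlib.util
import os

class Runtime:
    """Hypothetical runtime helper exposed to LLM-generated code."""

    def install_packages(self, packages):
        # A real implementation would ask the user for confirmation and
        # run pip; here we only report which packages are not importable.
        return [p for p in packages if importlib.util.find_spec(p) is None]

    def getenv(self, name, desc=None):
        # `desc` could be shown when prompting the user for a missing value.
        return os.environ.get(name)

runtime = Runtime()
os.environ["AIPY_DEMO_KEY"] = "hello"                 # demo value for getenv
missing = runtime.install_packages(["json", "csv"])   # stdlib: always present
value = runtime.getenv("AIPY_DEMO_KEY")
```

Routing package installs and environment reads through one object lets the host insert a confirmation step before anything touches the user's system.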

Usage

AIPython has two running modes:

  • Task mode: Simple and easy to use; just type your task. Suitable for users unfamiliar with Python.
  • Python mode: Accepts both task descriptions and ordinary Python commands. Suitable for advanced users familiar with Python.

Task mode is the default; switch to Python mode with the --python parameter.

Task Mode

uv run aipython

>>> Get the latest posts from Reddit r/LocalLLaMA
......
......
>>> /done

Python Mode

Basic Usage

Automatic task processing:

>>> ai("Get the title of Google's homepage")

Automatically Request to Install Third-Party Libraries

Python use - AIPython (Quit with 'exit()')
>>> ai("Use psutil to list all processes on MacOS")

📦 LLM requests to install third-party packages: ['psutil']
If you agree to install, please enter 'y' [y/n] (n): y

TODO

  • Use an AST pass to automatically detect and fix the Python code returned by the LLM

Thanks

  • Hei Ge: Product manager/senior user/chief tester
  • Sonnet 3.7: Generated the first version of the code, which was almost usable without modification.
  • ChatGPT: Provided many suggestions and code snippets, especially for the command-line interface.
  • Codeium: Intelligent code completion
  • Copilot: Code improvement suggestions and README translation
