Python app using an LLM via MCP for modeling and solving CSP problems with PyCSP3

Project description

💡 About csp-llm

csp-llm is a Python package that runs as an AI agent to automatically generate and execute PyCSP3 code for constraint problems.

It provides an interactive, customizable web user interface in which the user enters or imports the description of a constraint problem in natural language. The description is sent to a pre-configured LLM, which generates the code and displays it to the user. The user can either run the generated code directly or edit it before requesting its execution.
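To make the workflow concrete: the code the LLM produces declares variables and constraints and hands them to a solver. The following stand-in (plain Python, no PyCSP3 or Java needed) brute-forces a toy constraint problem of the same flavor; it is an illustration of what "a solution" means, not code produced by the application.

```python
from itertools import product

# Toy constraint problem: find digits x, y, z with
#   x + y == z, x < y, and all three values pairwise distinct.
# A real CSP solver searches this space far more cleverly;
# brute force is enough to show the idea.
solutions = [
    (x, y, z)
    for x, y, z in product(range(10), repeat=3)
    if x + y == z and x < y and len({x, y, z}) == 3
]

print(solutions[0])  # first solution found: (1, 2, 3)
```

PyCSP3 expresses the same constraints declaratively (`VarArray`, `satisfy`, `solve`) and delegates the search to an external solver, which is why Java is listed in the requirements below.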

The application integrates with a wide range of LLM providers (models deployed within CRIL, Anthropic models, OpenAI models, Google models, etc.).
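The project page does not show the client code, but many LLM platforms (including self-hosted gateways) expose OpenAI-compatible chat-completion endpoints, so a provider-agnostic client can largely reduce to building one request shape. A minimal sketch of that payload format; the function name and defaults here are assumptions, not the package's actual API:

```python
import json

def build_chat_request(model: str, prompt: str, temperature: float = 0.0) -> str:
    """Serialize an OpenAI-style chat-completion payload.

    Hypothetical helper: many providers and gateways accept this request
    shape, which is what makes a single front end workable across
    differently hosted models.
    """
    return json.dumps({
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    })

payload = build_chat_request("my-model", "Model the N-queens problem in PyCSP3.")
print(json.loads(payload)["model"])  # → my-model
```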

Once installed, the application ships with a few example constraint problems that the user can experiment with.

NB: The generated code may contain errors; in that case, the user can correct them directly via the interface.

🛑 Requirements

  • Runs on Linux and macOS (tested with bash on Linux and zsh on macOS).
  • Access to an LLM platform. LLM models from CRIL are proposed by default; users with a LAN account can use their API key. For further information, please contact Alain Kemgue (kemgue@cril.fr).
  • Python 3.10 or higher.
  • Java 8 or higher (required to run PyCSP3).

📦 Installation

We recommend installing the application in a Python virtual environment.

Virtual environment installation

python3 -m venv venv
source venv/bin/activate

Installing the csp-llm package from PyPI

pip install csp-llm

Launch the application

(venv) ordi@alain% launch-csp-llm     
🚀 Launching the application...
💡 Application dependencies
missing ScriptRunContext! This warning can be ignored when running in bare mode.
✅ anthropic 0.55.0
✅ openai 1.92.2
✅ streamlit 1.46.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ pycsp3 2.5.1
✅ Java 21.0.7 detected (>= 8)
🌐 Application available at: http://localhost:8501
💡 Press Ctrl+C to stop
--------------------------------------------------

  You can now view your Streamlit app in your browser.

  URL: http://localhost:8501

**************************************************

The application is then available at http://localhost:8501

You can change port and host by passing parameters to the launch script.

(venv) ordi@alain% launch-csp-llm --help              
usage: launch-csp-llm [-h] [--host HOST] [--port PORT] [-ev]

Launch the application

options:
  -h, --help   show this help message and exit
  --host HOST
  --port PORT
  -ev

Example of launching on port 3000 with host 0.0.0.0 (which makes the application accessible from the entire network):

(venv) ordi@alain% launch-csp-llm --port 3000 --host 0.0.0.0
🚀 Launching the application...
💡 Application dependencies
✅ anthropic 0.55.0
✅ openai 1.92.2
✅ streamlit 1.46.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ pycsp3 2.5.1
✅ Java 21.0.7 detected (>= 8)
🌐 Application available at: http://0.0.0.0:3000
💡 Press Ctrl+C to stop
--------------------------------------------------

  You can now view your Streamlit app in your browser.

  URL: http://0.0.0.0:3000

**************************************************
