
Python app using an LLM via MCP for modeling and solving CSP problems with PyCSP3

Project description

💡 About csp-llm

csp-llm is a Python package that runs as an AI agent to enable the automatic generation and execution of PyCSP3 code for constraint problems.

It provides an interactive, customizable web user interface that lets the user enter or import a natural-language description of a constraint problem. The description is sent to a pre-configured LLM, which generates the code and displays it to the user. The user can either request that the code be executed directly, or modify the generated code before requesting its execution.

The application incorporates modern technologies to run any type of LLM (models deployed within CRIL, Anthropic models, OpenAI models, Google models, etc.).

You can run the tool in multiple environments: your local LLM environment with LM Studio, your local environment with Ollama, or the default environment composed of external modern LLMs plus those deployed in the CRIL environment. To get good results locally, make sure your machine has good GPU performance and that you have installed large, recent versions of the LLM models. This gives you the opportunity to test different models.

The launch option for choosing a particular environment is described further down in this document.

Once installed, the application offers a few example constraint problems that the user can have fun testing.

P.S.: The generated code may contain errors, in which case the user is given the opportunity to correct it via the interface. Also note that LLMs can make mistakes and hallucinate; the more capable the model, the fewer errors the generated code is likely to contain.

🛑 Requirements

  • Runs on Linux and macOS (tested with bash on Linux and zsh on macOS).
  • Have access to an LLM platform. LLM models from CRIL are proposed by default; those with a LAN account can use their API key. For further information, please contact Alain Kemgue (kemgue@cril.fr).
  • Have Python 3 installed (3.10 or higher).
  • Have Java installed to run PyCSP3 (Java 8 or higher).
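The Python and Java prerequisites above can be checked with a small helper script (this is not part of csp-llm, just a convenience sketch):

```python
# Quick prerequisite check: Python >= 3.10 and a Java runtime on the PATH.
import subprocess
import sys

def check_prerequisites():
    python_ok = sys.version_info >= (3, 10)
    status = "OK" if python_ok else "too old, need >= 3.10"
    print(f"Python {sys.version.split()[0]}: {status}")
    try:
        # Most JDKs print the version banner of `java -version` to stderr.
        result = subprocess.run(["java", "-version"],
                                capture_output=True, text=True)
        banner = result.stderr.splitlines()[0] if result.stderr else "found"
        print(f"Java: {banner}")
    except FileNotFoundError:
        print("Java: not found on PATH (required by pycsp3)")
    return python_ok

check_prerequisites()
```

The script only reports what it finds; the version thresholds mirror the requirements listed above.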

📦 Installation

We recommend installing the application in a Python virtual environment.

Virtual environment installation

python3 -m venv venv
source venv/bin/activate

Installing the csp-llm package from PyPI

pip install csp-llm

Launch the application

(venv) ordi@alain% launch-csp-llm     
🚀 Launching the application...
💡 Application dependencies
missing ScriptRunContext! This warning can be ignored when running in bare mode.
✅ anthropic 0.60.0
✅ openai 1.97.1
✅ streamlit 1.47.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ aiohttp 3.12.15
✅ pycsp3 2.5.1
✅ Java 21.0.8 detected (>= 8)
🌐 Application available at: http://localhost:8501
💡 LLM environment : GENERIC_CRIL_LLM
💡 Press Ctrl+C to stop
--------------------------------------------------

  You can now view your Streamlit app in your browser.

  URL: http://localhost:8501

The application is then available at http://localhost:8501 with the generic external LLM models.

You can change the port and host by passing parameters to the launch script.

(venv) ordi@alain% launch-csp-llm --help              
usage: csp_llm.py [-h] [--host HOST] [--port PORT] [--llm_env {local_ollama,local_lmstudio,generic_cril_llm}] [-ev]

Launch the application

options:
  -h, --help            show this help message and exit
  --host HOST           ip host of the web server. Default host is 'localhost'. Use 0.0.0.0 to make app accessible on the entire network 
  --port PORT           server's listening port. Default port is 8501
  --llm_env {local_ollama,local_lmstudio,generic_cril_llm}
                        default value is 'generic_cril_llm' 
                        use --llm_env option to specify your LLM environnement 
                        value 'local_ollama' to use local ollama installed on your computer 
                        value 'local_lmstudio' to use local lmstudio installed on your computer 
                        value 'generic_cril_llm' to use external LLM + CRIL plateform
  -ev                   verbose mode

Example of launching the application on port 3000 and host 0.0.0.0 (making the application accessible on the entire network), in a local LM Studio environment:

(venv) ordi@alain% launch-csp-llm --port 3000 --host 0.0.0.0  --llm_env local_lmstudio
🚀 Launching the application...
💡 Application dependencies
✅ anthropic 0.60.0
✅ openai 1.97.1
✅ streamlit 1.47.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ aiohttp 3.12.15
✅ pycsp3 2.5.1
✅ Java 21.0.8 detected (>= 8)
🌐 Application available at: http://0.0.0.0:3000
💡 LLM environment : LOCAL_LMSTUDIO
💡 Press Ctrl+C to stop
--------------------------------------------------

  You can now view your Streamlit app in your browser.

  URL: http://0.0.0.0:3000

Project details


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution


csp_llm-0.2.2-py3-none-any.whl (19.0 kB)


File details

Details for the file csp_llm-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: csp_llm-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 19.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for csp_llm-0.2.2-py3-none-any.whl:

  • SHA256: 8547864b96fbb52dacefea0472150869045dcee92a80e8beb1c6fa33b72636d7
  • MD5: 7988370dfa652d00aba6d21a0bcda2fe
  • BLAKE2b-256: 5c19cf469b5ddd91fa5007e14099b114db26c82a4dca22ccbb03608ba7e502b5

