
Cornflow is an open source multi-solver optimization server with a REST API built using flask, airflow and pulp.

While most deployment servers are organized around a solving technique (MIP, CP, NLP, etc.), Cornflow focuses on the optimization problems themselves. It does not, however, impose any constraint on the type of problem or on the solution method used.

With Cornflow you can deploy a Traveling Salesman Problem solver next to a Knapsack solver or a Nurse Rostering Problem solver. As long as you describe the input and output data, you can upload any solution method for any problem and then use it with any data you want.

Cornflow helps you formalize your problem by proposing development guidelines. It also provides a range of functionalities around your deployed solution method, namely:

  • storage of users, instances, solutions and solution logs.

  • deployment and maintenance of models, solvers and algorithms.

  • scheduling of executions in remote machines.

  • management of said executions: start, monitor, interrupt.

  • centralizing of commercial licenses.

  • scenario storage and comparison.

  • user management, roles and groups.

Installation instructions

Cornflow is tested on Ubuntu 20.04 with Python >= 3.8 and git.

Create a virtual environment and install Cornflow:

python3 -m venv venv
venv/bin/pip3 install cornflow

Initialize the SQLite database:

source venv/bin/activate
export FLASK_APP=cornflow.app
export DATABASE_URL=sqlite:///cornflow.db
flask db upgrade
flask access_init
flask create_service_user -u airflow -e airflow_test@admin.com -p airflow_test_password
flask create_admin_user -u cornflow -e cornflow_admin@admin.com -p cornflow_admin_password

Activate the virtual environment and run Cornflow:

source venv/bin/activate
export FLASK_APP=cornflow.app
export SECRET_KEY=THISNEEDSTOBECHANGED
export DATABASE_URL=sqlite:///cornflow.db
export AIRFLOW_URL=http://127.0.0.1:8080/
export AIRFLOW_USER=airflow_user
export AIRFLOW_PWD=airflow_pwd
flask run

Cornflow needs a running Airflow installation and some additional configuration to operate. Check the installation docs for more details on installing Airflow, configuring the application and initializing the database.

Using cornflow to solve a PuLP model

We are going to test the Cornflow server using the cornflow-client and pulp Python packages:

pip install cornflow-client pulp

Initialize the API client:

from cornflow_client import CornFlow
email = 'some_email@gmail.com'
pwd = 'Some_password1'
username = 'some_name'
client = CornFlow(url="http://127.0.0.1:5000")

Create a user:

config = dict(username=username, email=email, pwd=pwd)
client.sign_up(**config)

Log in:

client.login(username=username, pwd=pwd)

Prepare an instance:

import pulp
prob = pulp.LpProblem("test_export_dict_MIP", pulp.LpMinimize)
x = pulp.LpVariable("x", 0, 4)
y = pulp.LpVariable("y", -1, 1)
z = pulp.LpVariable("z", 0, None, pulp.LpInteger)
prob += x + 4 * y + 9 * z, "obj"
prob += x + y <= 5, "c1"
prob += x + z >= 10, "c2"
prob += -y + z == 7.5, "c3"
data = prob.to_dict()
insName = 'test_export_dict_MIP'
description = 'very small example'

Send instance:

instance = client.create_instance(
    data, name=insName, description=description, schema="solve_model_dag",
)

Solve an instance:

config = dict(solver="PULP_CBC_CMD", timeLimit=10)
execution = client.create_execution(
    instance['id'], config, name='execution1', description='execution of a very small instance',
    schema="solve_model_dag",
)

Check the status of an execution:

status = client.get_status(execution["id"])
print(status['state'])
# 1 means "finished correctly"
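In practice the execution may still be running the first time you check, so a small polling helper is handy. This is only a sketch: `wait_for_execution` is not part of cornflow-client, and the only state code taken from this guide is 1 ("finished correctly"):

```python
import time

def wait_for_execution(get_status, execution_id, timeout=60, interval=2):
    """Poll an execution until its state is 1 ("finished correctly").

    get_status is any callable returning a dict with a 'state' key,
    e.g. client.get_status. Raises TimeoutError when the deadline passes.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(execution_id)
        if status["state"] == 1:
            return status
        time.sleep(interval)
    raise TimeoutError(f"execution {execution_id} did not finish in {timeout}s")
```

With the client above you would call `wait_for_execution(client.get_status, execution['id'])` instead of checking the state by hand.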

Retrieve a solution:

results = client.get_solution(execution['id'])
print(results['data'])
# returns a json with the solved pulp object
_vars, prob = pulp.LpProblem.from_dict(results['data'])
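The payload is the solved problem in pulp's dictionary format, so you can also read the variable values without rebuilding the `LpProblem`. This sketch assumes pulp's `to_dict()` layout, where `data['variables']` is a list of entries carrying `name` and `varValue`:

```python
def variable_values(solved_data):
    """Map each variable name to its solved value in a pulp to_dict() payload."""
    return {v["name"]: v["varValue"] for v in solved_data["variables"]}
```

For the model above, `variable_values(results['data'])` returns a `{name: value}` dict for `x`, `y` and `z`.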

Retrieve the log of the solver:

log = client.get_log(execution['id'])
print(log['log'])
# json format of the solver log

Using cornflow to deploy a solution method

To deploy a cornflow solution method, the following tasks need to be accomplished:

  1. Create an Application for the new problem

  2. Open a pull request to a compatible repo linked to a server instance (e.g., like this one).

For more details on each part, check the deployment guide.

Using cornflow to solve a problem

For this example we only need the cornflow_client package. We will test the graph-coloring demo defined here, using the test server to solve it.

Initialize the API client:

from cornflow_client import CornFlow
email = 'readme@gmail.com'
pwd = 'some_password'
username = 'some_name'
client = CornFlow(url="https://devsm.cornflow.baobabsoluciones.app/")
client.login(username=username, pwd=pwd)

Solve a graph coloring problem and get the solution:

data = dict(pairs=[dict(n1=0, n2=1), dict(n1=1, n2=2), dict(n1=1, n2=3)])
instance = client.create_instance(data, name='gc_4_1', description='very small gc problem', schema="graph_coloring")
config = dict()
execution = client.create_execution(
    instance['id'], config, name='gc_4_1_exec', description='execution of very small gc problem',
    schema="graph_coloring",
)
status = client.get_status(execution["id"])
print(status['state'])
# wait until the state is 1 ("finished correctly") before asking for the solution
solution = client.get_solution(execution["id"])
print(solution['data']['assignment'])
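For reference, the assignment returned by the server gives each node a color such that no edge connects two nodes of the same color. A tiny greedy version of that idea (purely illustrative, not the deployed solver) can be written as:

```python
from collections import defaultdict

def greedy_coloring(pairs):
    """Give each node the smallest color unused by its already-colored neighbors."""
    adjacent = defaultdict(set)
    for edge in pairs:
        adjacent[edge["n1"]].add(edge["n2"])
        adjacent[edge["n2"]].add(edge["n1"])
    colors = {}
    for node in sorted(adjacent):
        taken = {colors[n] for n in adjacent[node] if n in colors}
        colors[node] = next(c for c in range(len(adjacent)) if c not in taken)
    return colors
```

For the three-edge instance above this yields a valid coloring with two colors.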

Running tests and coverage

First, configure the testing environment:

export FLASK_ENV=testing

Then you can run all the tests with the following command:

python -m unittest discover -s cornflow.tests

If you want to run only the unit tests (without a local Airflow webserver):

python -m unittest discover -s cornflow.tests.unit

If you want to run only the integration tests (with a local Airflow webserver):

python -m unittest discover -s cornflow.tests.integration

To check the coverage report, run:

coverage run --source=./cornflow/ -m unittest discover -s=./cornflow/tests/
coverage report -m

or, to get the HTML report:

coverage html
