
A pipeline to call a traffic simulator: SUMO

Project description

sumo_docker_pipeline


The sumo_docker_pipeline package lets you run the traffic simulator SUMO efficiently and interact with it easily from Python. The package is especially useful when you need to run SUMO iteratively.

SUMO is often tricky to install locally because of its dependencies, so running SUMO in a Docker container is a natural choice.

However, running SUMO in a Docker container raises another issue: it is hard to build a pipeline between SUMO and your Python code.

This package provides an easy-to-use Python API while SUMO runs inside a Docker container.

Requirement

  • python > 3.5
  • docker
  • docker-compose

Install

Pull the image (or build a Docker image with SUMO)

A pre-built image is available on Docker Hub.

docker pull kensukemi/sumo-ubuntu18

If you prefer to build the image yourself, run the following command.

docker-compose build 

Install the Python package

make install
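
The package is also distributed on PyPI, so installing the released version with pip should work as well (assuming you do not need the development setup that make install prepares):

pip install sumo_docker_pipeline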

Example case: iterative run with parameter updates

Let's say that you want to run SUMO iteratively, changing the input parameters depending on the results of each simulation.

In that case, you need to check SUMO's output and update the parameters after each run. The sumo_docker_pipeline package lets you automate this process.

Setups

  1. Creation of a directory where you save SUMO's configuration.
  2. Creation of template files for SUMO's configuration.
  3. Running the pipeline.

1. creation of a directory

This is the directory that SUMO accesses. Let's say that we create tests/resources/config_template.

2. creation of SUMO's configuration

You prepare the configuration files that SUMO requires. The format of the configuration files is exactly the same as what SUMO expects.

The only difference is that you write the wildcard ? at the places whose values you want to replace during the pipeline run.

For example, tests/resources/config_template/grid.flows.xml has the following element,

<vType vClass="passenger" id="passenger"  tau="0.5" speedDev="0.1" maxSpeed="?" minGap="?" accel="?" decel="?" latAlignment="center" />

With the sumo_docker_pipeline package, you can replace the attributes marked with the wildcard ? before each simulation run.
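
To make the idea concrete, the sketch below shows roughly what such a substitution amounts to. It uses only the Python standard library; the output path and the parameter values are hypothetical examples, and this is not the package's internal implementation.

# Illustration only: fill every "?" attribute of a template XML with concrete values.
# Paths and parameter values below are hypothetical examples.
import xml.etree.ElementTree as ET

def fill_wildcards(template_path: str, output_path: str, values: dict) -> None:
    tree = ET.parse(template_path)
    for element in tree.iter():
        for attribute, current in element.attrib.items():
            if current == "?" and attribute in values:
                element.set(attribute, str(values[attribute]))
    tree.write(output_path)

fill_wildcards(
    "tests/resources/config_template/grid.flows.xml",
    "tests/resources/config/grid.flows.xml",
    {"maxSpeed": 15.0, "minGap": 2.5, "accel": 2.6, "decel": 4.5},
)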

3. running the pipeline

See the examples directory in the repository. A rough sketch of the workflow follows.
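
Below is a minimal sketch of an iterative run. The class name DockerPipeline, its constructor arguments, and the run_simulation call are assumptions made for illustration, not a verified API; refer to the examples directory for working code.

# Hedged sketch: DockerPipeline, its arguments, and run_simulation are assumed names.
from pathlib import Path
from sumo_docker_pipeline import DockerPipeline  # assumed entry point

pipeline = DockerPipeline(
    path_config_file=Path("tests/resources/config_template/grid.sumo.cfg"),  # assumed config path
    scenario_name="test-scenario",
)

# Values for the wildcards in grid.flows.xml (hypothetical numbers).
parameters = {"grid.flows.xml": {"maxSpeed": 15.0, "minGap": 2.5, "accel": 2.6, "decel": 4.5}}

for iteration in range(5):
    result = pipeline.run_simulation(parameters)  # runs SUMO inside the Docker container
    # Inspect the simulation output here and update the parameters for the next run.
    parameters["grid.flows.xml"]["maxSpeed"] += 1.0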

For developers

pytest tests

License and credit

The source code is licensed under MIT. The website content is licensed under CC BY 4.0.

@misc{sumo-docker-pipeline,
  author = {Kensuke Mitsuzawa},
  title = {sumo-docker-pipeline},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Kensuke-Mitsuzawa/sumo_docker_pipeline}},
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sumo_docker_pipeline-2.0.tar.gz (13.4 kB)

Uploaded Source

Built Distribution

sumo_docker_pipeline-2.0-py3-none-any.whl (16.9 kB)

Uploaded Python 3

File details

Details for the file sumo_docker_pipeline-2.0.tar.gz.

File metadata

  • Download URL: sumo_docker_pipeline-2.0.tar.gz
  • Upload date:
  • Size: 13.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.8 Linux/5.4.0-91-lowlatency

File hashes

Hashes for sumo_docker_pipeline-2.0.tar.gz

  • SHA256: c1e84f46d4502d8178760f0dc2b56b862de2807a22de44043554f6627f6398fe
  • MD5: 69342b887503328040d8c6156dcf151f
  • BLAKE2b-256: 6bf90d9a16f92fcc2c7891a2b29c24362f048fb3805d8ef7916b2f4a6ad9923a


File details

Details for the file sumo_docker_pipeline-2.0-py3-none-any.whl.

File metadata

  • Download URL: sumo_docker_pipeline-2.0-py3-none-any.whl
  • Upload date:
  • Size: 16.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.8 Linux/5.4.0-91-lowlatency

File hashes

Hashes for sumo_docker_pipeline-2.0-py3-none-any.whl

  • SHA256: 695dc854cf53d96bb743881d37d2ec268a35c1d074c1979bd38914a57fab9289
  • MD5: 93ede42ba6e3ce669332d9ce275bcb5b
  • BLAKE2b-256: 539b4088ba0774c634d63c7757e841ffdce1820d3f90967e66a9eced21b17bce

