

This project has been archived by its maintainers; no new releases are expected.


Sparkle


A Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning.

Specifically, Sparkle facilitates the use of:

  • Automated algorithm configuration
  • Automated algorithm selection

Furthermore, Sparkle handles various tasks for the user such as:

  • Algorithm meta information collection and statistics calculation
  • Instance/Data Set management and feature extraction
  • Compute cluster job submission and monitoring
  • Log file collection

Installation

A quick and complete installation of Sparkle can be done using Conda (see the Conda documentation for how to install Conda itself).

Simply download the environment.yml file from GitHub with wget:

wget https://raw.githubusercontent.com/ADA-research/Sparkle/main/environment.yml

and run:

conda env create -f environment.yml

The installation of the environment may take up to five minutes depending on your internet connection. Once the environment has been created it can be activated by:

conda activate sparkle

The creation of the Conda environment also installs the Sparkle package itself. Note that the environment must be reactivated in every new terminal session before using Sparkle.
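Since forgetting to reactivate the environment is easy, scripts that call Sparkle can guard against it. The following POSIX-shell check is our own sketch (the function name check_sparkle_env is illustrative, not a Sparkle command); it relies on conda setting CONDA_DEFAULT_ENV to the name of the activated environment:

```shell
# Warn when the sparkle Conda environment is not active.
# check_sparkle_env is an illustrative helper, not part of Sparkle.
check_sparkle_env() {
  if [ "${CONDA_DEFAULT_ENV:-}" = "sparkle" ]; then
    return 0
  fi
  echo "Please run: conda activate sparkle" >&2
  return 1
}

check_sparkle_env || true  # warn, but do not abort an interactive shell
```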

Sparkle can also be installed as a standalone package using pip. We recommend creating a new virtual environment (for example, with venv) beforehand to avoid clashes between dependencies.

pip install SparkleAI

Note that a direct installation through pip does not handle certain dependencies of the Sparkle CLI, such as the libraries required to compile RunSolver.
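Setting up such an isolated environment with venv might look like this (the directory name sparkle-venv is our own choice, not prescribed by Sparkle):

```shell
# Create and activate an isolated environment for Sparkle;
# afterwards, install the package inside it with: pip install SparkleAI
python3 -m venv sparkle-venv
. sparkle-venv/bin/activate
```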

Install dependencies

Aside from several package dependencies, Sparkle's package / CLI relies on a few user-supplied executables:

  • A LaTeX compiler (pdflatex) for report generation
  • Java (tested with version 1.8.0_402) in order to use SMAC2
  • R (tested with version 4.3.1) in order to use IRACE
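Whether these executables are available on the PATH can be checked up front with a short shell loop (this check is our own sketch, not a Sparkle command):

```shell
# Report which of the user-supplied executables Sparkle relies on are on PATH
for tool in pdflatex java R; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
  fi
done
```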

Other dependencies are handled by the Conda environment; if Conda is not an option for you, please make sure the dependencies listed above are installed manually.

For detailed installation instructions see the documentation: https://ada-research.github.io/Sparkle/

Developer installation

The dev-env.yml file provides the developer environment for the Sparkle package and contains several extra packages for testing.

The two environments can exist in parallel, since one is named sparkle and the other sparkle-dev. If you want to update an environment, it is best to do a clean installation by removing and recreating it. For example:

conda deactivate
conda env remove -n sparkle
conda env create -f environment.yml
conda activate sparkle

This should be fast, as both Conda and pip use a local package cache.
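If environments are recreated often, the steps above can be wrapped in a small shell helper. This is our own sketch (the function name refresh_env is illustrative, not part of Sparkle), using only the conda subcommands shown above:

```shell
# Recreate a Conda environment from its YAML file.
# refresh_env is an illustrative helper, not a Sparkle command.
refresh_env() {
  name=$1
  file=$2
  conda deactivate 2>/dev/null || true  # ignore "nothing to deactivate"
  conda env remove -n "$name"
  conda env create -f "$file"
}

# Usage: refresh_env sparkle environment.yml
#        refresh_env sparkle-dev dev-env.yml
```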

Examples

See the Examples directory for examples of how to use Sparkle. All Sparkle CLI commands need to be executed from the root of an initialised Sparkle directory.

Documentation

The documentation can be read at https://ada-research.github.io/Sparkle/.

A PDF is also available in the repository.

Licensing

Sparkle is distributed under the MIT licence.

Component licences

Sparkle is distributed with a number of external components, solvers, and instance sets. Descriptions and licensing information for each of these are included in the sparkle/Components and Examples/Resources/ directories.

The SATzilla 2012 feature extractor from http://www.cs.ubc.ca/labs/beta/Projects/SATzilla/ is used with some modifications; the main modification disables calls to the SAT instance preprocessor SatELite. It is located in Examples/Resources/Extractors/SAT-features-competition2012_revised_without_SatELite_sparkle/

Citation

If you use Sparkle for one of your papers and want to cite it, please cite our paper describing Sparkle: K. van der Blom, H. H. Hoos, C. Luo and J. G. Rook, Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems, in IEEE Transactions on Evolutionary Computation, vol. 26, no. 6, pp. 1351-1364, Dec. 2022, doi: 10.1109/TEVC.2022.3215013.

@article{BloEtAl22,
  title={Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems}, 
  author={van der Blom, Koen and Hoos, Holger H. and Luo, Chuan and Rook, Jeroen G.},
  journal={IEEE Transactions on Evolutionary Computation}, 
  year={2022},
  volume={26},
  number={6},
  pages={1351--1364},
  doi={10.1109/TEVC.2022.3215013}
}

Maintainers

Thijs Snelleman, Jeroen Rook, Holger H. Hoos

Contributors

Chuan Luo, Richard Middelkoop, Jérémie Gobeil, Sam Vermeulen, Marcel Baumann, Jakob Bossek, Tarek Junied, Yingliu Lu, Malte Schwerin, Aaron Berger, Marie Anastacio, Koen van der Blom, Noah Peil, Brian Schiller

Contact

sparkle@aim.rwth-aachen.de

Sponsors

The development of Sparkle is partially sponsored by the Alexander von Humboldt Foundation.
