Fuzzy Self-Tuning PSO global optimization library

Project description

Fuzzy Self-Tuning PSO (FST-PSO) is a swarm intelligence global optimization method [1] based on Particle Swarm Optimization [2].

FST-PSO is designed for the optimization of real-valued multi-dimensional multi-modal minimization problems.

FST-PSO is a settings-free version of PSO that exploits fuzzy logic to dynamically assign the functioning parameters to each particle in the swarm. Specifically, at each generation FST-PSO determines the optimal choice for the cognitive factor, the social factor, the inertia value, the minimum velocity, and the maximum velocity. FST-PSO also uses a heuristic to choose the swarm size, so that the user does not have to specify any setting.

In order to use FST-PSO, the programmer must implement a custom fitness function. Moreover, the programmer must specify the number of dimensions of the problem and the boundaries of the search space for each dimension. The programmer can optionally specify the maximum number of iterations. When the stopping criterion is met, FST-PSO returns the best solution found, along with its fitness value.

Example

FST-PSO can be used as follows:

from fstpso import FuzzyPSO

def example_fitness(particle):
    # Sphere function: sum of the squared coordinates of the particle
    return sum(map(lambda x: x**2, particle))

if __name__ == '__main__':
    dims = 10

    FP = FuzzyPSO()
    FP.set_search_space([[-10, 10]] * dims)   # same [-10, 10] interval for every dimension
    FP.set_fitness(example_fitness)

    result = FP.solve_with_fstpso()
    print("Best solution:", result[0])
    print("Whose fitness is:", result[1])
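As mentioned above, the maximum number of iterations can optionally be limited. A minimal sketch, assuming that solve_with_fstpso() accepts a max_iter keyword argument (an assumption; check the GitHub repository for the exact signature of the installed version):

    result = FP.solve_with_fstpso(max_iter=100)  # max_iter is an assumed keyword argument name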

Further information

FST-PSO has been created by M.S. Nobile, D. Besozzi, G. Pasi, G. Mauri, R. Colombo (University of Milan-Bicocca, Italy), and P. Cazzaniga (University of Bergamo, Italy). The source code was written by M.S. Nobile.

FST-PSO requires two packages: miniful and numpy.
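If needed, FST-PSO can typically be installed from PyPI (this assumes a standard pip setup; the command below should also retrieve miniful and numpy when they are declared as dependencies):

    pip install fst-pso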

Further information on GitHub: <https://github.com/aresio/fst-pso>

[1] Nobile, Cazzaniga, Besozzi, Colombo, Mauri, Pasi, “Fuzzy Self-Tuning PSO: A Settings-Free Algorithm for Global Optimization”, Swarm and Evolutionary Computation, 2017 (doi:10.1016/j.swevo.2017.09.001), <http://www.sciencedirect.com/science/article/pii/S2210650216303534>

[2] Kennedy, Eberhart, “Particle Swarm Optimization”, in: Proceedings of the IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942–1948
