
Fuzzy Self-Tuning PSO global optimization library

Project description

Fuzzy Self-Tuning PSO (FST-PSO) is a swarm intelligence global optimization method [1] based on Particle Swarm Optimization [2].

FST-PSO is designed for real-valued, multi-dimensional, multi-modal minimization problems.

FST-PSO is a settings-free version of PSO that exploits fuzzy logic to dynamically assign the functioning parameters to each particle in the swarm. Specifically, during each generation, FST-PSO determines the optimal choice for the cognitive factor, the social factor, the inertia value, the minimum velocity, and the maximum velocity. FST-PSO also uses a heuristic to choose the swarm size, so that the user does not have to select any setting.
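For reference, the quantities that FST-PSO tunes are the coefficients of the classic PSO velocity update from [2]. The sketch below shows that update in plain NumPy, with illustrative parameter values; the fuzzy rules that FST-PSO actually uses to pick these values per particle are described in [1] and are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_velocity_update(v, x, pbest, gbest, w, c1, c2, vmin, vmax):
    """One standard PSO velocity update (Kennedy & Eberhart, 1995).

    w       : inertia weight
    c1, c2  : cognitive and social factors
    vmin/vmax : velocity clamps
    These five values are exactly what FST-PSO assigns to each
    particle, every generation, via its fuzzy rule base.
    """
    r1 = rng.random(x.shape)  # stochastic weight of the personal best
    r2 = rng.random(x.shape)  # stochastic weight of the global best
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return np.clip(v_new, vmin, vmax)
```

In standard PSO these coefficients are fixed in advance by the user; FST-PSO's contribution is replacing that manual choice with per-particle fuzzy inference.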

In order to use FST-PSO, the programmer must implement a custom fitness function and specify the number of dimensions of the problem, along with the boundaries of the search space for each dimension. The maximum number of iterations can optionally be specified. When the stopping criterion is met, FST-PSO returns the best solution found, along with its fitness value.

Example

FST-PSO can be used as follows:

from fstpso import FuzzyPSO

def example_fitness(particle):
    # Sphere function: sum of squared coordinates, minimum 0 at the origin.
    return sum(map(lambda x: x**2, particle))

if __name__ == '__main__':
    dims = 10
    FP = FuzzyPSO()
    FP.set_search_space([[-10, 10]] * dims)
    FP.set_fitness(example_fitness)
    result = FP.solve_with_fstpso()
    print("Best solution:", result[0])
    print("Whose fitness is:", result[1])

Further information

FST-PSO has been created by M.S. Nobile, D. Besozzi, G. Pasi, G. Mauri, R. Colombo (University of Milano-Bicocca, Italy), and P. Cazzaniga (University of Bergamo, Italy). The source code was written by M.S. Nobile.

FST-PSO requires two packages: miniful and numpy.
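FST-PSO and both of its dependencies are distributed on PyPI, so a pip install is typically all that is needed (the distribution name below is taken from the package files; check the GitHub page if your environment differs):

```shell
# Installs fst-pso together with its miniful and numpy dependencies.
pip install fst-pso
```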

Further information on GitHub: <https://github.com/aresio/fst-pso>

[1] Nobile, Cazzaniga, Besozzi, Colombo, Mauri, Pasi, "Fuzzy Self-Tuning PSO: A Settings-Free Algorithm for Global Optimization", Swarm and Evolutionary Computation, 2017 (doi:10.1016/j.swevo.2017.09.001) <http://www.sciencedirect.com/science/article/pii/S2210650216303534>

[2] Kennedy, Eberhart, "Particle Swarm Optimization", in: Proceedings of the IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942–1948

Project details


Download files

Download the file for your platform.

Source distribution: fst_pso-1.9.0.tar.gz (22.4 kB, uploaded via twine/6.1.0 on CPython/3.11.7)
  SHA256: ce77f503f55ccf910b14616ba43cdd8fc3b75c40f1a6efa60826f7e467d6a8df

Built distribution: fst_pso-1.9.0-py3-none-any.whl (25.0 kB, Python 3, uploaded via twine/6.1.0 on CPython/3.11.7)
  SHA256: 53eca7721fc02979ba8734ff25eaa1ea204376155c294d68c7f5191ef2fda07a
