
Fuzzy Self-Tuning PSO global optimization library


Fuzzy Self-Tuning PSO (FST-PSO) is a swarm intelligence global optimization method [1] based on Particle Swarm Optimization [2].

FST-PSO is designed for the optimization of real-valued, multi-dimensional, multi-modal minimization problems.

FST-PSO is a settings-free version of PSO that exploits fuzzy logic to dynamically assign the functioning parameters to each particle in the swarm. Specifically, during each generation, FST-PSO determines the optimal choice for the cognitive factor, the social factor, the inertia value, the minimum velocity, and the maximum velocity of each particle. FST-PSO also uses a heuristic to choose the swarm size, so that the user does not have to select any functioning setting.
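For context, the parameters that FST-PSO tunes appear in the classic PSO velocity update of [2]. Below is a minimal sketch of standard PSO on the sphere function, with fixed, illustrative values for the inertia, cognitive, and social factors and a velocity clamp; these values are assumptions for illustration and are not FST-PSO's fuzzy-derived ones.

```python
import random

def pso_sphere(dims=2, swarm_size=30, iters=200, seed=42):
    # Classic PSO (Kennedy & Eberhart): the quantities below (inertia w,
    # cognitive c1, social c2, velocity clamp vmax) are the same parameters
    # that FST-PSO assigns per particle via fuzzy rules; here they are
    # fixed illustrative values.
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    lo, hi = -10.0, 10.0
    vmax = 0.2 * (hi - lo)

    def fitness(p):
        # Sphere benchmark: sum of squared coordinates, minimum 0 at the origin
        return sum(x * x for x in p)

    pos = [[rng.uniform(lo, hi) for _ in range(dims)] for _ in range(swarm_size)]
    vel = [[0.0] * dims for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = min(range(swarm_size), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dims):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, vel[i][d]))  # clamp velocity
                pos[i][d] = max(lo, min(hi, pos[i][d] + vel[i][d]))  # stay in bounds
            f = fitness(pos[i])
            if f < pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f < gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit
```

FST-PSO removes the need to hand-pick w, c1, c2, vmax, and the swarm size, replacing the fixed values above with per-particle settings inferred by fuzzy rules at each generation.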

To use FST-PSO, the programmer must implement a custom fitness function, specify the number of dimensions of the problem, and define the boundaries of the search space for each dimension. The maximum number of iterations can optionally be specified. When the stopping criterion is met, FST-PSO returns the best solution found, along with its fitness value.
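Any Python callable that maps a candidate solution (a list of coordinates) to a scalar fitness can serve as the custom fitness function. As a sketch, here is the standard Rastrigin multi-modal benchmark; the name rastrigin is ours, for illustration, and is not part of the library.

```python
import math

def rastrigin(particle):
    # Rastrigin benchmark: highly multi-modal, global minimum 0 at the origin.
    # Any callable with this signature (list of floats -> scalar) can be
    # passed to FST-PSO as the fitness function to minimize.
    return 10 * len(particle) + sum(
        x * x - 10 * math.cos(2 * math.pi * x) for x in particle
    )
```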

Example

FST-PSO can be used as follows:

    from fstpso import FuzzyPSO

    def example_fitness(particle):
        # Sphere function: sum of squared coordinates
        return sum(map(lambda x: x**2, particle))

    if __name__ == '__main__':
        dims = 10
        FP = FuzzyPSO()
        FP.set_search_space([[-10, 10]] * dims)
        FP.set_fitness(example_fitness)
        result = FP.solve_with_fstpso()
        print("Best solution:", result[0])
        print("Whose fitness is:", result[1])

Further information

FST-PSO has been created by M.S. Nobile, D. Besozzi, G. Pasi, G. Mauri, R. Colombo (University of Milan-Bicocca, Italy), and P. Cazzaniga (University of Bergamo, Italy). The source code was written by M.S. Nobile.

FST-PSO requires two packages: miniful and numpy.

Further information on GitHub: <https://github.com/aresio/fst-pso>

[1] Nobile, Cazzaniga, Besozzi, Colombo, Mauri, Pasi, "Fuzzy Self-Tuning PSO: A Settings-Free Algorithm for Global Optimization", Swarm and Evolutionary Computation, 39:70-85, 2018 (doi:10.1016/j.swevo.2017.09.001)

[2] Kennedy, Eberhart, "Particle Swarm Optimization", in: Proceedings of the IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942–1948

