The Revised Stochastic Simplex Bisection Algorithm and Particle Swarm Optimization
Christer Samuelsson
DOI: http://dx.doi.org/10.15439/2017F119
Citation: Communication Papers of the 2017 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 13, pages 103–110 (2017)
Abstract. The stochastic simplex bisection (SSB) algorithm is evaluated against particle swarm optimization (PSO) on a prominent test set. The original SSB algorithm performs on par with the PSO algorithm, and a revised version of the SSB algorithm outperforms both. Detailed analysis of the performance on selected objective functions brings to light key properties of the three algorithms. The core SSB algorithm is here viewed as a sampling tool for an outer loop that employs statistical pattern recognition. This opens the door to a host of other schemes.
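For orientation, the sketch below shows the standard global-best PSO update (inertia weight w, cognitive and social coefficients c1, c2) that the abstract refers to; it is a minimal illustration of textbook PSO, not the particular configuration or the SSB variants evaluated in the paper. The parameter values and the sphere test function are illustrative assumptions.

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO for a box-constrained objective f: R^d -> R.

    bounds: sequence of (low, high) pairs, one per dimension.
    Returns (best position found, best objective value). Illustrative only.
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    low, high = bounds[:, 0], bounds[:, 1]
    d = len(bounds)

    # Initialise positions uniformly in the box and velocities to zero.
    x = rng.uniform(low, high, size=(n_particles, d))
    v = np.zeros_like(x)

    # Personal bests and the swarm's global best.
    p = x.copy()
    p_val = np.array([f(xi) for xi in x])
    g = p[np.argmin(p_val)].copy()
    g_val = p_val.min()

    for _ in range(n_iter):
        r1 = rng.random((n_particles, d))
        r2 = rng.random((n_particles, d))
        # Standard velocity update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, low, high)  # keep particles inside the box

        # Update personal and global bests.
        val = np.array([f(xi) for xi in x])
        improved = val < p_val
        p[improved], p_val[improved] = x[improved], val[improved]
        if p_val.min() < g_val:
            g_val = p_val.min()
            g = p[np.argmin(p_val)].copy()

    return g, g_val

if __name__ == "__main__":
    # Example: 2-D sphere function; the minimum is 0 at the origin.
    sphere = lambda z: float(np.sum(z ** 2))
    best_x, best_f = pso(sphere, [(-5, 5), (-5, 5)])
    print(best_x, best_f)
```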