
Proceedings of the 16th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 25

Optimized Method based on Lattice Sequences for Multidimensional Integrals in Neural Networks


DOI: http://dx.doi.org/10.15439/2021F53

Citation: Proceedings of the 16th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 25, pages 243–246 (2021)


Abstract. In this work we investigate advanced stochastic methods for solving a specific multidimensional problem related to neural networks. Monte Carlo and quasi-Monte Carlo techniques have been developed over many years in a range of different fields, but have only recently been applied to problems arising in neural networks. As well as providing a consistent framework for statistical pattern recognition, the stochastic approach offers a number of practical advantages, including a solution to the problem in higher dimensions. For the first time, multidimensional integrals of up to 100 dimensions related to this area are discussed in our numerical study.
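The lattice-based quasi-Monte Carlo approach the abstract refers to can be sketched briefly. The fragment below compares plain Monte Carlo with a randomly shifted rank-1 lattice rule on a smooth toy integrand over the unit cube; the Korobov-style multiplier `a = 1571` and the test integrand are illustrative assumptions, not the optimized generating vectors or the neural-network integrals studied in the paper.

```python
import numpy as np

def mc_estimate(f, d, n, rng):
    # Plain Monte Carlo: average f over n uniform random points in [0,1]^d.
    return f(rng.random((n, d))).mean()

def lattice_estimate(f, d, n, shift, a=1571):
    # Rank-1 lattice rule with a Korobov-type generating vector
    # z = (1, a, a^2, ...) mod n and a random shift. The multiplier
    # a = 1571 is a hypothetical choice for illustration, not an
    # optimized vector from the paper.
    z = np.array([pow(a, j, n) for j in range(d)])
    k = np.arange(n).reshape(-1, 1)
    points = (k * z / n + shift) % 1.0   # n shifted lattice points in [0,1)^d
    return f(points).mean()

# Toy problem: integral of exp(-sum x_i) over [0,1]^d,
# with known exact value (1 - e^{-1})^d.
d, n = 10, 4096
f = lambda x: np.exp(-x.sum(axis=1))
exact = (1.0 - np.exp(-1.0)) ** d

rng = np.random.default_rng(0)
print("MC error:     ", abs(mc_estimate(f, d, n, rng) - exact))
print("lattice error:", abs(lattice_estimate(f, d, n, rng.random(d)) - exact))
```

For smooth integrands, a well-chosen lattice rule typically converges faster than the O(n^{-1/2}) Monte Carlo rate, which is what makes such sequences attractive for high-dimensional integration.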

References

  1. Bahvalov, N.: On the Approximate Computation of Multiple Integrals. Vestnik Moscow State University, Vol. 4, pp. 3–18, 1959.
  2. Centeno, V., Georgiev, I. R., Mihova, V., Pavlov, V.: Price Forecasting and Risk Portfolio Optimization. AIP Conference Proceedings, Vol. 2164, No. 1, p. 060006, AIP Publishing, 2019.
  3. Dimov, I.: Monte Carlo Methods for Applied Scientists. World Scientific, New Jersey, London, Singapore, 2008, 291 pp.
  4. Kuo, F. Y., Nuyens, D.: Application of Quasi-Monte Carlo Methods to Elliptic PDEs with Random Diffusion Coefficients: a Survey of Analysis and Implementation. Foundations of Computational Mathematics, Vol. 16, No. 6, pp. 1631–1696, 2016.
  5. Lin, S.: Algebraic Methods for Evaluating Integrals in Bayesian Statistics. Ph.D. dissertation, UC Berkeley, May 2011.
  6. Lin, S., Sturmfels, B., Xu, Z.: Marginal Likelihood Integrals for Mixtures of Independence Models. Journal of Machine Learning Research, Vol. 10, pp. 1611–1631, 2009.
  7. Minasny, B., McBratney, B.: A Conditioned Latin Hypercube Method for Sampling in the Presence of Ancillary Information. Computers and Geosciences, Vol. 32, No. 9, pp. 1378–1388, 2006.
  8. Paskov, S. H.: Computing High Dimensional Integrals with Applications to Finance. Technical Report CUCS-023-94, Columbia University, 1994.
  9. Song, J., Zhao, S., Ermon, S.: A-NICE-MC: Adversarial Training for MCMC. Advances in Neural Information Processing Systems, pp. 5140–5150, 2017.
  10. Wang, Y., Hickernell, F.: An Historical Overview of Lattice Point Sets. 2002.
  11. Watanabe, S.: Algebraic Analysis for Nonidentifiable Learning Machines. Neural Computation, Vol. 13, pp. 899–933, 2001.