Normality Testing for Vectors on Perceptron Layers


  •   Youmna Shawki Karaki

  •   Halina Kaubasa

  •   Nick Ivanov


Designing the optimal topology of a network graph is one of the most prevalent issues in neural network applications. The number of hidden layers, the number of nodes per layer, the activation functions, and other parameters of a neural network must suit the given data set and the problem at hand. Massive learning datasets prompt researchers to exploit probabilistic methods in the search for an optimal network structure. Classic Bayesian estimation of network hyperparameters assumes the distribution of specific random parameters to be Gaussian. Multivariate normality analysis methods are widespread in contemporary applied mathematics. In this article, the normality of the probability distribution of vectors on perceptron layers is examined with a multivariate normality test. Ten datasets from the University of California, Irvine repository were selected for the computing experiment. The result is negative: none of the sets of vectors passed the normality criteria, so the hypothesis of a Gaussian distribution is rejected.

Keywords: Bayesian optimization, Gaussian distribution, Hyperparameters, Neural Networks
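The multivariate normality test referenced in the abstract is commonly carried out with Mardia's skewness and kurtosis statistics (see the Mardia 1970 reference below). The sketch that follows is an illustrative implementation of that standard test in NumPy/SciPy, not the authors' code; the function name `mardia_test` and the significance level `alpha` are assumptions for the example.

```python
import numpy as np
from scipy import stats

def mardia_test(X, alpha=0.05):
    """Mardia's multivariate skewness/kurtosis test for normality.

    X is an (n, p) array of n observations of p-dimensional vectors.
    Returns (is_normal, p_skewness, p_kurtosis).
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)       # MLE covariance estimate
    S_inv = np.linalg.inv(S)
    D = Xc @ S_inv @ Xc.T                        # pairwise Mahalanobis products

    b1p = (D ** 3).sum() / n**2                  # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n            # multivariate kurtosis

    # n*b1p/6 is asymptotically chi-square with p(p+1)(p+2)/6 d.o.f.
    skew_stat = n * b1p / 6.0
    df = p * (p + 1) * (p + 2) / 6.0
    p_skew = stats.chi2.sf(skew_stat, df)

    # (b2p - p(p+2)) / sqrt(8p(p+2)/n) is asymptotically standard normal.
    kurt_stat = (b2p - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
    p_kurt = 2.0 * stats.norm.sf(abs(kurt_stat))

    is_normal = (p_skew > alpha) and (p_kurt > alpha)
    return is_normal, p_skew, p_kurt
```

A sample fails the test (as every layer vector set did in the paper's experiment) when either p-value falls below the significance level; strongly skewed data, e.g. exponentially distributed samples, is rejected almost surely at moderate sample sizes.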


K. Swingler, “Applying neural networks: a practical guide,” London, Academic Press, 1996, p. 303.

J. Bergstra, and Y. Bengio, “Random search for hyperparameter optimization,” The Journal of Machine Learning Research, vol. 13, no. 1, pp. 281-305, 2012.

J. Bergstra, D. Yamins, and D. Cox, "Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures", The Journal of Machine Learning Research, vol. 28, no. 1, 2013, pp. 118-123.

J. Snoek, H. Larochelle, and R.P. Adams, "Practical Bayesian optimization of machine learning algorithms", Proc. of Advances in Neural Information Processing Systems 25, NIPS, 3-6 Dec. 2012, Lake Tahoe, Nevada, USA, pp. 2960-2968.

J. Lampinen, and A. Vehtari, “Bayesian approach for neural networks - review and case studies,” Neural Networks, vol. 14, no. 3, 2001, pp. 7-24.

D. Maclaurin, D. Duvenaud, and R.P. Adams, "Gradient-based hyperparameter optimization through reversible learning", Proc. of the 32nd Int. Conf. on Machine Learning, ICML, 6-11 July 2015, Lille, France, pp. 2113-2122.

D. Orive, G. Sorrosal, C.E. Borges, C. Martin, and A. Alonso-Vicario, “Evolutionary algorithms for hyperparameter tuning on neural networks models”, Proc. of the 26th European Modeling and Simulation Symposium, EMSS, 10-12 Sept. 2014, Bordeaux, France, pp. 402 - 410.

E. Bochinski, T. Senst, and T. Sikora, “Hyperparameter optimization for convolutional neural network committees based on evolutionary algorithms”, Proc. of 2017 IEEE Image Processing Int. Conf., ICIP-2017, 17-20 Sept. 2017, Beijing, China, pp. 3924-3928.

F. Dernoncourt, and J.Y. Lee, “Optimizing neural network hyperparameters with Gaussian processes for dialog act classification”, Proc. of IEEE Spoken Language Technology Workshop, SLT 2016, 13-16 Dec. 2016, San Diego, USA, pp. 406-413.

P. Murugan, "Hyperparameters Optimization in Deep Convolutional Neural Network / Bayesian Approach with Gaussian Process Priors", arXiv:1712.07233v1, 19 Dec. 2017, p. 10.

University of California, Irvine, Machine Learning Data Sets. Retrieved 4 Feb. 2019 and 7 Oct. 2019.

H. Scheffé, "The analysis of variance", New York, John Wiley & Sons, 1999, p. 477.

D.N. Joanes, and C.A. Gill, “Comparing Measures of Sample Skewness and Kurtosis”, The Statistician, vol. 47, Part 1, 1998, pp. 183-189.

K.H. Yuan, P.M. Bentler, and W. Zhang, “The effect of skewness and kurtosis on mean and covariance structure analysis: the univariate case and its multivariate implication”, Sociological Methods & Research, vol. 34, no. 2, 2005, pp. 240–258.

K.V. Mardia, "Measures of multivariate skewness and kurtosis with applications", Biometrika, vol. 57, 1970, pp. 519-530.

L. Baringhaus, and N. Henze, “Limit distributions for measures of multivariate skewness and kurtosis based on projections”, Journal of Multivariate Analysis, vol. 38, 1991, pp. 51–69.




How to Cite
Karaki, Y., Kaubasa, H. and Ivanov, N. 2020. Normality Testing for Vectors on Perceptron Layers. European Journal of Engineering and Technology Research. 5, 9 (Sep. 2020), 1085-1088. DOI: