Applying Bayesian Decision Theory in RBF Neural Networks to Improve Network Precision in Data Classification
Abstract — The RBF neural network is a common tool for data classification. Because features are not connected to one another within the layers of networks such as the RBF network, feature values are never multiplied together, so interactions and dependencies among features are not taken into account during classification or regression. The main reason for this lack of connectivity among features is the difficulty of learning the corresponding weights. For these reasons, this research uses the multiplication of the event probabilities of feature values, following the classification style of Bayesian decision theory, to improve classification performance in the RBF neural network. In addition, the linear weight coefficients in the final layer of the RBF network are used to determine the importance of each feature's event in the final decision. The approach exploits the RBF network's ability to assign event probabilities to input feature values based on data centers, while the linear weights in the final layer remain learnable and further improve classification. Empirical experiments show good results.
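As a minimal sketch of the idea described in the abstract (not the authors' exact implementation — the function names, the Gaussian form of the RBF units, and the per-class grouping of centers and weights are all illustrative assumptions), each RBF center assigns a Gaussian probability to every input feature, the per-feature probabilities are multiplied together in the Bayesian style, and the learned linear weights of the final layer combine the resulting joint probabilities into a class score:

```python
import math

def rbf_feature_probs(x, center, sigma=1.0):
    """Per-feature Gaussian activation: how probable each feature value is
    under this center (illustrative formulation, not the paper's exact one)."""
    return [math.exp(-((xi - ci) ** 2) / (2 * sigma ** 2))
            for xi, ci in zip(x, center)]

def classify(x, centers_by_class, weights_by_class, sigma=1.0):
    """Score each class by a weighted sum, over its centers, of the PRODUCT
    of per-feature probabilities (the Bayesian-style multiplication that
    reintroduces feature dependencies)."""
    scores = {}
    for cls, centers in centers_by_class.items():
        score = 0.0
        for w, center in zip(weights_by_class[cls], centers):
            probs = rbf_feature_probs(x, center, sigma)
            joint = math.prod(probs)   # multiply feature event probabilities
            score += w * joint         # linear output-layer weight
        scores[cls] = score
    return max(scores, key=scores.get), scores

# Toy example: two classes, two data centers each, 2-D feature vectors.
centers_by_class = {
    0: [(0.0, 0.0), (0.5, 0.5)],
    1: [(3.0, 3.0), (3.5, 3.5)],
}
weights_by_class = {0: [1.0, 1.0], 1: [1.0, 1.0]}

label, _ = classify((0.2, 0.1), centers_by_class, weights_by_class)
# A point near the class-0 centers is assigned label 0.
```

In this sketch the weights are fixed; in the paper's setting they would be learned, which is what the final paragraph of the abstract refers to.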