Logic Learning in Adaline Neural Network

Nadia Athirah Norani, Mohd Shareduwan Mohd Kasihmuddin, Mohd. Asyraf Mansor and Noor Saifurina Nana Khurizan

Pertanika Journal of Science & Technology, Volume 29, Issue 1, January 2021

DOI: https://doi.org/10.47836/pjst.29.1.16

Published: 22 January 2021

In this paper, the Adaline Neural Network (ADNN) is explored to simulate the actual signal processing between input and output. One of the drawbacks of the conventional ADNN is its reliance on a non-systematic rule to define the learning of the network. This research incorporates logic programming consisting of various prominent logical representations. These logical rules serve as symbolic rules that define the learning mechanism of ADNN. All the mentioned logical rules are tested with different learning rates, with the aim of minimizing the Mean Square Error (MSE). This paper uncovers the best logical rule to govern ADNN, namely the one yielding the lowest MSE value. A thorough comparison of ADNN performance is discussed based on MSE. The outcomes of this paper will be beneficial in various fields of knowledge that require immense data processing effort, such as engineering, healthcare, marketing, and business.
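To make the setup concrete, the following is a minimal sketch (not the authors' implementation) of Widrow-Hoff (LMS) training of a single Adaline unit on a symbolic logical rule. The choice of the rule A AND B in bipolar encoding, the learning rates, and the epoch count are illustrative assumptions; the point is that the delta rule updates the weights from the linear (pre-threshold) error, and different learning rates yield different final MSE values.

```python
import numpy as np

def train_adaline(X, y, lr=0.01, epochs=100):
    """Widrow-Hoff (LMS) training of one Adaline unit; returns weights, bias, MSE history."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    mse_history = []
    for _ in range(epochs):
        net = X @ w + b          # linear activation (no threshold during training)
        err = y - net            # error measured on the linear output
        w += lr * (X.T @ err)    # delta-rule weight update
        b += lr * err.sum()
        mse_history.append(float((err ** 2).mean()))
    return w, b, mse_history

# Bipolar truth table for the example logical rule A AND B
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)

for lr in (0.001, 0.01, 0.1):
    _, _, mse = train_adaline(X, y, lr=lr)
    print(f"lr={lr}: final MSE = {mse[-1]:.4f}")
```

Because the unit is linear, the MSE for this rule cannot reach zero; it converges toward the least-squares floor (0.25 here), and thresholding the trained linear output with a sign function still reproduces the AND truth table.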


ISSN 0128-7702

e-ISSN 2231-8534

Article ID: JST-2143-2020
