
     Research Journal of Applied Sciences, Engineering and Technology


CAST: A Constant Adaptive Skipping Training Algorithm for Improving the Learning Rate of Multilayer Feedforward Neural Networks

R. Manjula Devi and S. Kuppuswami
Faculty of Computer Science and Engineering, Kongu Engineering College, Perundurai, Erode, India
Research Journal of Applied Sciences, Engineering and Technology  2016  12(8): 790-812
http://dx.doi.org/10.19026/rjaset.12.2780  |  © The Author(s) 2016
Received: November 11, 2013  |  Accepted: November 29, 2013  |  Published: April 15, 2016

Abstract

Multilayer Feedforward Neural Networks (MFNNs) have been widely applied to a broad range of supervised pattern recognition tasks. The major drawback of the MFNN training phase is its long training time, especially when the network is trained on very large datasets. Accordingly, this paper proposes an enhanced training algorithm, the Constant Adaptive Skipping Training (CAST) algorithm, which reduces the training time of the MFNN through stochastic presentation of the training data. The stochastic presentation is accomplished by partitioning the training dataset into two disjoint classes, classified and misclassified, based on a comparison of the calculated error measure with a threshold value. Only the input samples in the misclassified class are presented to the MFNN for training in the next epoch, whereas each correctly classified sample is skipped for a constant number of epochs, dynamically reducing the number of training samples presented in every epoch. Steadily shrinking the training dataset in this way reduces the total training time, thereby speeding up the training process. The CAST algorithm can be combined with any supervised training algorithm, can train datasets with any number of patterns, and is simple to implement. The proposed algorithm is evaluated on the benchmark datasets Iris, Waveform, Heart Disease and Breast Cancer for different learning rates. Simulation results show that the CAST algorithm trains faster than both the LAST and the standard BPN algorithms.
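
The skipping scheme described above can be summarized in a short sketch. The Python fragment below is a minimal illustration under stated assumptions, not the authors' implementation: the names cast_train, train_step and sample_error, the threshold THRESHOLD and the constant skip factor SKIP are hypothetical stand-ins for the paper's error measure and skipping constant.

    import numpy as np

    # Hypothetical constants: the paper's actual error threshold and skipping
    # constant are design parameters, not these particular values.
    THRESHOLD = 0.1   # error measure below which a sample counts as "classified"
    SKIP = 2          # constant number of epochs a classified sample is skipped

    def cast_train(network, X, y, epochs, train_step, sample_error):
        # Sketch of the Constant Adaptive Skipping Training (CAST) loop.
        # train_step(network, x, t)   performs one weight update on sample (x, t)
        # sample_error(network, x, t) returns a scalar error measure for (x, t)
        n = len(X)
        skip = np.zeros(n, dtype=int)       # epochs left to skip each sample
        for _ in range(epochs):
            for i in range(n):
                if skip[i] > 0:             # correctly classified earlier:
                    skip[i] -= 1            # sit out a constant interval
                    continue
                train_step(network, X[i], y[i])
                if sample_error(network, X[i], y[i]) <= THRESHOLD:
                    skip[i] = SKIP          # move sample to the classified class
            # samples with skip == 0 (misclassified) are presented again next epoch

Because each classified sample sits out a fixed number of epochs, the effective dataset shrinks as training converges, which is the source of the speed-up over presenting every sample in every epoch.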

Keywords:

Adaptive skipping, learning rate, MFNN, neural network, training algorithm, training speed


References

  1. Ampazis, N. and S.J. Perantonis, 2002. Two highly efficient second-order algorithms for training feedforward networks. IEEE T. Neural Networ., 13(5): 1064-1074.
2. Asuncion, A. and D.J. Newman, 2007. UCI Machine Learning Repository. School of Information and Computer Science, University of California, Irvine, CA. Retrieved from: http://www.ics.uci.edu/~mlearn/.
  3. Behera, L., S. Kumar and A. Patnaik, 2006. On adaptive learning rate that guarantees convergence in feedforward networks. IEEE T. Neural Networ., 17(5): 1116-1125.
  4. Devi, R.M., S. Kuppuswami and R.C. Suganthe, 2013. Fast linear adaptive skipping training algorithm for training artificial neural network. Math. Probl. Eng., Vol. 2013, 9 pages.
  5. Hornik, K., M. Stinchcombe and H. White, 1989. Multilayer feedforward networks are universal approximators. Neural Networks, 2(5): 359-366.
  6. Huang, G.B., Y.Q. Chen and H.A. Babri, 2000. Classification ability of single hidden layer feedforward neural networks. IEEE T. Neural Networ., 11(3): 799-801.
  7. Mehra, P. and B.W. Wah, 1992. Artificial Neural Networks: Concepts and Theory. 1st Edn., IEEE Computer Society Press, Los Alamitos, Calif, pp: 667.
  8. Nguyen, D. and B. Widrow, 1990. Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. Proceedings of the IJCNN International Joint Conference on Neural Networks. San Diego, CA, USA, 3: 21-26.
  9. Plagianakos, V.P., D.G. Sotiropoulos and M.N. Vrahatis, 1998. A nonmonotone backpropagation training method for neural networks. Department of Mathematics, University of Patras, Technical Report No. TR98-04.
  10. Razavi, S. and B.A. Tolson, 2011. A new formulation for feedforward neural networks. IEEE T. Neural Networ., 22(10): 1588-1598.
  11. Shao, H. and G. Zheng, 2009. A new BP algorithm with adaptive momentum for FNNs training. Proceedings of the WRI Global Congress on Intelligent Systems. Xiamen, China, 4: 16-20.
  12. Varnava, T.M. and A.J. Meade Jr., 2011. An initialization method for feedforward artificial neural networks using polynomial bases. Adv. Adaptive Data Anal., 3: 385-400.
  13. Wilamowski, B.M. and H. Yu, 2010. Improved computation for Levenberg-Marquardt training. IEEE T. Neural Networ., 21: 930-937.
  14. Yu, H. and B.M. Wilamowski, 2012. Neural Network Training with Second Order Algorithms. In: Hippe, Z.S. et al. (Eds.), Human-Computer Systems Interaction: Backgrounds and Applications 2. Advances in Intelligent and Soft Computing, Springer-Verlag, Berlin, Heidelberg, 99: 463-476.

Competing interests

The authors have no competing interests.

Open Access Policy

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Copyright

© The Author(s) 2016.

ISSN (Online):  2040-7467
ISSN (Print):   2040-7459