General Information
    • ISSN: 1793-8201 (Print), 2972-4511 (Online)
    • Abbreviated Title: Int. J. Comput. Theory Eng.
    • Frequency: Quarterly
    • DOI: 10.7763/IJCTE
    • Editor-in-Chief: Prof. Mehmet Sahinoglu
    • Associate Editor-in-Chief: Assoc. Prof. Alberto Arteta, Assoc. Prof. Engin Maşazade
    • Managing Editor: Ms. Mia Hu
    • Abstracting/Indexing: Scopus (since 2022), INSPEC (IET), CNKI, Google Scholar, EBSCO, etc.
    • Average Days from Submission to Acceptance: 192 days
    • E-mail: ijcte@iacsitp.com

Editor-in-Chief
Prof. Mehmet Sahinoglu
Computer Science Department, Troy University, USA
I am happy to take on the position of editor-in-chief of IJCTE. We encourage authors to submit papers on any branch of computer theory and engineering.

IJCTE 2013 Vol. 5(3): 454-459 ISSN: 1793-8201
DOI: 10.7763/IJCTE.2013.V5.729

Kernel Recursive Least Squares for the CMAC Neural Network

C. W. Laufer and G. Coghill

Abstract—The Cerebellar Model Articulation Controller (CMAC) is an associative memory neural network biologically inspired by the cerebellum found in animal brains. The standard CMAC uses the least mean squares (LMS) algorithm to train its weights. Recently, the recursive least squares (RLS) algorithm was proposed as a superior alternative for training the CMAC online, as it converges in a single epoch and requires no tuning of a learning rate. However, the computational cost of RLS depends on the number of weights required by the CMAC, which is often large, making the algorithm computationally inefficient. More recently, kernel methods were proposed for the CMAC to reduce memory usage and improve its modeling capability. In this paper the kernel recursive least squares (KRLS) algorithm is applied to the CMAC. Owing to the kernel method, the computational complexity of the CMAC becomes dependent on the number of unique training data, which can be significantly smaller than the number of weights required by non-kernel CMACs. Additionally, online sparsification techniques are applied to further improve computational speed.

Index Terms—CMAC, kernel recursive least squares.

The authors are with the Department of Electrical and Electronic Engineering, University of Auckland, Auckland, New Zealand (e-mail: clau070@aucklanduni.ac.nz, g.coghill@auckland.ac.nz).
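
As a concrete illustration of the approach the abstract describes, the sketch below implements a generic kernel recursive least squares regressor with a simple novelty-based sparsification gate. It is not the authors' CMAC-specific algorithm: the Gaussian kernel, the ridge term `lam`, and the novelty threshold `nu` are illustrative assumptions, and full KRLS (e.g., with approximate linear dependence sparsification) also updates the dual weights for points the gate skips, a step this sketch omits for brevity.

```python
# Minimal sketch of kernel recursive least squares (KRLS) regression.
# NOT the paper's CMAC-specific implementation: kernel choice,
# regularization `lam`, and novelty threshold `nu` are assumptions.
import numpy as np

def gaussian_kernel(a, b, width=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * width ** 2))

class KRLS:
    def __init__(self, lam=1e-3, nu=1e-4):
        self.lam = lam      # ridge regularization
        self.nu = nu        # novelty threshold for sparsification
        self.dict = []      # stored training inputs ("dictionary")
        self.P = None       # inverse of (K + lam*I) over the dictionary
        self.alpha = None   # dual weights
        self.y = None       # stored targets

    def predict(self, x):
        if not self.dict:
            return 0.0
        k = np.array([gaussian_kernel(d, x) for d in self.dict])
        return float(k @ self.alpha)

    def update(self, x, y):
        if not self.dict:
            c = gaussian_kernel(x, x) + self.lam
            self.dict = [x]
            self.P = np.array([[1.0 / c]])
            self.y = np.array([y])
            self.alpha = self.P @ self.y
            return
        b = np.array([gaussian_kernel(d, x) for d in self.dict])
        gamma = gaussian_kernel(x, x) + self.lam - b @ self.P @ b
        # Simplified sparsification: skip points the dictionary already
        # represents well. (Full KRLS also updates alpha for skipped
        # points; that step is omitted here.)
        if gamma < self.nu:
            return
        # Block-matrix update of P = (K + lam*I)^(-1) for the grown
        # dictionary, using the Schur complement gamma.
        Pb = self.P @ b
        top = np.hstack([self.P + np.outer(Pb, Pb) / gamma,
                         (-Pb / gamma)[:, None]])
        bottom = np.hstack([-Pb / gamma, [1.0 / gamma]])
        self.P = np.vstack([top, bottom[None, :]])
        self.dict.append(x)
        self.y = np.append(self.y, y)
        self.alpha = self.P @ self.y

# Example usage on a tiny data stream.
model = KRLS()
for xv, t in [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0)]:
    model.update(np.array([xv]), t)
print(model.predict(np.array([1.5])))
```

Each update costs O(m^2) in the dictionary size m, so the cost grows with the number of unique (here, sufficiently novel) training points rather than with a fixed, often large, weight count, which is the efficiency argument made in the abstract.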


Cite: C. W. Laufer and G. Coghill, "Kernel Recursive Least Squares for the CMAC Neural Network," International Journal of Computer Theory and Engineering, vol. 5, no. 3, pp. 454-459, 2013.

