The cellular simultaneous recurrent network (CSRN) has been shown to be a more powerful function approximator than the multilayer perceptron (MLP): for some problems an MLP would require prohibitively large complexity, whereas a CSRN can realize the desired mapping within acceptable computational constraints. The training speed of such complex recurrent networks is crucial to their successful application. This work improves on previous results by training the network with an extended Kalman filter (EKF). We implemented a generic CSRN and applied it to two challenging problems: 2-D maze navigation and a subset of the connectedness problem. For maze navigation, the speed of convergence improved by several orders of magnitude over earlier results; for connectedness, superior generalization was demonstrated. The implications of these improvements are discussed in [1].
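To illustrate the EKF training idea, the network weights are treated as the state to be estimated, and the network output as the measurement. A minimal sketch of one such update step is below; the function name, dimensions, and noise covariances are illustrative assumptions, not taken from the authors' implementation.

```python
import numpy as np

def ekf_weight_update(w, P, H, y, y_hat, R, Q):
    """One EKF step with the network weights `w` as the state vector.

    H     : Jacobian of the network output w.r.t. the weights
    y     : target output, y_hat : current network output
    R, Q  : measurement- and process-noise covariances (tuning parameters)
    """
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    w_new = w + K @ (y - y_hat)       # move weights along the output error
    P_new = P - K @ H @ P + Q         # update weight covariance
    return w_new, P_new
```

Compared with plain gradient descent, the gain matrix K rescales the output error by the current weight uncertainty P, which is what gives EKF training its typically much faster convergence on recurrent networks.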

This code is available for download via the external resources page.

The above work has been extended to perform affine image transformations using the CSRN architecture; the details can be found in [2]. Moreover, an improved learning algorithm based on the unscented Kalman filter (UKF) has been proposed in [3].

[1] R. Ilin, R. Kozma, and P. Werbos, “Cellular SRN trained by extended Kalman filter shows promise for ADP,” in Proc. IEEE Int. Joint Conf. Neural Networks (IJCNN), Jul. 2006, pp. 506–510.
[2] J. K. Anderson and K. M. Iftekharuddin, “Learning topological image transforms using cellular simultaneous recurrent networks,” in Proc. IEEE Int. Joint Conf. Neural Networks (IJCNN), 2013, pp. 1–9.
[3] L. Vidyaratne, M. Alam, J. K. Anderson, and K. M. Iftekharuddin, “Improved training of cellular SRN using unscented Kalman filtering for ADP,” in Proc. IEEE Int. Joint Conf. Neural Networks (IJCNN), 2014, pp. 993–1000.