Abstract
A novel online learning approach for neuro-fuzzy models is proposed in this paper. Unlike most previous online methods, which use spherical clusters to define the validity region of neurons, the proposed learning method is based on a recursive extension of the Gath–Geva clustering algorithm, which is capable of constructing elliptical clusters as well. Eliminating the constraint of spherical clusters by allowing general structures for the covariance matrices enables the proposed evolving neuro-fuzzy model (ENFM) to capture more sophisticated behaviors with less modeling error and fewer neurons. The proposed recursive clustering method can cluster data streams, identifying the required number of clusters online and estimating the cluster parameters recursively. A merging strategy is also proposed to combine similar clusters, which prevents the model from accumulating an excessive number of neurons with similar behaviors. The applicability of the ENFM is investigated in modeling a time-varying heat exchanger system and in predicting the Mackey–Glass and sunspot number time series. Simulation results indicate better performance of the proposed model compared with several well-known modeling and prediction methods.
References
Angelov PP, Filev DP (2004) An approach to online identification of Takagi–Sugeno fuzzy models. IEEE Trans Syst Man Cybern B 34(1):484–498
Angelov PP, Filev DP (2005) Simpl_eTS: a simplified method for learning evolving Takagi–Sugeno fuzzy models. In: Proceedings of IEEE Int Conf on Fuzzy Syst, pp 1068–1073
Angelov PP, Zhou X (2006) Evolving fuzzy systems from data streams in real-time. In: Proceedings of IEEE Int Symp on Evolving Fuzzy Systems, pp 29–35
Beringer J, Hüllermeier E (2008) Online clustering of parallel data streams. Data Knowl Eng 58:180–204
Bittanti S, Piroddi L (1996) Nonlinear identification and control of a heat exchanger: a neural network approach. J Franklin Inst 334(1):135–153
Chiu SL (1994) Fuzzy model identification based on cluster estimation. J Intell Fuzzy Syst 2:267–278
Crespo F, Weber R (2005) A methodology for dynamic data mining based on fuzzy clustering. Fuzzy Sets Syst 150:267–284
DaISy (2009) A database for identification of systems. http://www.esat.kuleuven.be/sista/daisy/
Deng D, Kasabov N (2000) Evolving self-organizing maps for online learning, data analysis and modeling. In: Proceedings of IJCNN 2000 neural networks, neural computation: new challenges and perspectives for the new millennium, Como, Italy, pp 3–8
Fritzke B (1995) A growing neural gas network learns topologies. Adv Neural Inf Process Syst 7:625–632
Gath I, Geva AB (1989) Unsupervised optimal fuzzy clustering. IEEE Trans Pattern Anal Mach Intell 11(7):773–780
Georgieva O, Klawonn F (2008) Dynamic data assigning assessment clustering of streaming data. Appl Soft Comput 8:1305–1313
Gholipour A, Araabi BN, Lucas C (2006) Predicting chaotic time series using neural and neurofuzzy models: a comparison study. Neural Process Lett 24:217–239
Gustafson DE, Kessel WC (1978) Fuzzy clustering with a fuzzy covariance matrix. In: Proceedings of IEEE Conf on decision and control, San Diego, pp 761–766
Höppner F, Klawonn F et al (1999) Fuzzy cluster analysis. Wiley, London
Jang JS (1993) ANFIS: adaptive-network-based fuzzy inference system. IEEE Trans Syst Man Cybern 23(3):665–685
Kasabov N (1998) Evolving fuzzy neural networks-algorithms, applications and biological motivation. In: Yamakawa T, Matsumoto G (eds) Methodologies for the conception, design and applications of soft computing. World Scientific, Singapore, pp 271–274
Kasabov N, Song Q (2002) DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction. IEEE Trans Fuzzy Syst 10(2):144–154
Kim CH, Kim MS, Lee JJ (2007) Incremental hyperplane-based fuzzy clustering for system modeling. In: Proceedings of 33rd Conf of IEEE Industrial Electronics Society, Taipei, Taiwan, pp 614–619
Lughofer E, Klement EP (2005) FLEXFIS: a variant for incremental learning of Takagi–Sugeno fuzzy systems. In: Proceedings of IEEE Int Conf on fuzzy systems, pp 915–920
Lughofer ED (2008) FLEXFIS: a robust incremental learning approach for evolving Takagi–Sugeno fuzzy models. IEEE Trans Fuzzy Syst 16(6):1393–1410
Mackey M, Glass L (1977) Oscillation and chaos in physiological control systems. Science 197:281–287
Martinez B, Herrera F, Fernandez J, Marichal E (2008) An incremental clustering method and its application in online fuzzy modeling. Stud Fuzziness Soft Comput 224:163–178
McNish AG, Lincoln JV (1949) Prediction of sunspot numbers. Trans Am Geophys Union 30:673
Mirmomeni M, Lucas C, Araabi BN (2009) Introducing recursive learning algorithm for system identification of nonlinear time varying processes. In: Proceedings of 17th Mediterranean Conf. on Control and Automation, Thessaloniki, Greece, pp 736–741
Nelles O (1996) Local linear model tree for on-line identification of time invariant nonlinear dynamic systems. In: Proceedings of Int Conf on Artificial Neural Networks (ICANN), Bochum, Germany, pp 115–120
Nelles O (2001) Nonlinear system identification. Springer, London
Palit AK, Popovic D (2005) Computational intelligence in time series forecasting: theory and engineering applications. Advances in industrial control. Springer, London
Pedrycz W (2008) A dynamic data granulation through adjustable fuzzy clustering. Pattern Recognit Lett 29:2059–2066
Platt J (1991) A resource allocation network for function interpolation. Neural Comput 3:213–225
Sello S (2001) Solar cycle forecasting: a nonlinear dynamic approach. Astron Astrophys 377(1):312–320
SIDC (2003) Solar influences data analysis center. http://sidc.oma.be/index.php3
Takagi T, Sugeno M (1985) Fuzzy identification of systems and its application to modeling and control. IEEE Trans Syst Man Cybern 15:116–132
Zilouchian A, Jamshidi M (eds) (2001) Intelligent control systems using soft computing methodologies. CRC Press, Boca Raton
Appendices
Appendix 1: Recursive tuning of cluster parameters
In the offline GG clustering method, the centers and covariance matrices are estimated by a summation over all data points. In the online method, however, to increase the speed of the algorithm they must be updated based on each new data point alone. Assuming that a previous estimate of the center of the ith cluster, based on the previous Q data points, is available, the new estimate after the arrival of another data point can be computed as follows.
which can alternatively be written in the form of the following equations.
where \(\tilde{x}\) and \(\tilde{u}_i\) denote the new data point and its membership in the ith cluster, respectively. Further simplification of (26) yields
which is equivalent to (9).
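The displayed equations of this derivation are not reproduced in this version of the text. A hedged reconstruction, consistent with the GG weighted-center definition and the symbols used above (with \(N_i^-=\sum_{j=1}^{Q}u_{ij}^m\) denoting the accumulated membership mass of the first Q points), would read:

```latex
% Reconstruction (assumed numbering) of the center recursion, eqs. (25)-(27):
\mu_i^{+}
  = \frac{\sum_{j=1}^{Q} u_{ij}^{m}\,x_j + \tilde{u}_i^{m}\,\tilde{x}}
         {\sum_{j=1}^{Q} u_{ij}^{m} + \tilde{u}_i^{m}}
  = \frac{N_i^{-}\,\mu_i^{-} + \tilde{u}_i^{m}\,\tilde{x}}
         {N_i^{-} + \tilde{u}_i^{m}}
  = \mu_i^{-} + \frac{\tilde{u}_i^{m}}{N_i^{-} + \tilde{u}_i^{m}}
    \bigl(\tilde{x} - \mu_i^{-}\bigr)
```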
The same procedure can be applied for recursive tuning of the covariance matrix. Considering (6), after the arrival of the new data point, the covariance matrix is computed as follows.
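This display is also missing here; given a fuzzy covariance definition of the usual GG form in (6), the updated estimate after the new point presumably takes the form (a reconstruction, not the original display):

```latex
% Reconstruction of the updated covariance, eq. (28):
F_i^{+}
  = \frac{\sum_{j=1}^{Q} u_{ij}^{m}\,(x_j-\mu_i^{+})(x_j-\mu_i^{+})^{T}
          + \tilde{u}_i^{m}\,(\tilde{x}-\mu_i^{+})(\tilde{x}-\mu_i^{+})^{T}}
         {N_i^{-} + \tilde{u}_i^{m}}
```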
Substituting (9) into \((x_j-\mu_i^+)(x_j-\mu_i^+)^T\) and defining \(\beta={\frac{\tilde{u}_i^m}{N_i^-+\tilde{u}_i^m}},\) the following equations can be derived.
or
By substituting (30) into the summation term in the numerator of (28), and using the fact that \(\sum_{j=1}^Q u_{ij}^m(x_j-\mu_i^-)=0\) by the definition of \(\mu_i^-\), we have
Furthermore, \((\tilde{x}-\mu_i^+)(\tilde{x}-\mu_i^+)^T\) in (28) can be simplified to the following equation by substituting \(\mu_i^+\) from (9).
Thus, by substituting (31) and (32) into (28) and replacing β, the following formula is derived for updating the covariance matrix, which is equivalent to (10).
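The resulting one-step updates, \(\mu_i^+=\mu_i^-+\beta(\tilde{x}-\mu_i^-)\) and \(F_i^+=(1-\beta)\bigl(F_i^-+\beta(\tilde{x}-\mu_i^-)(\tilde{x}-\mu_i^-)^T\bigr)\) with \(\beta=\tilde{u}_i^m/(N_i^-+\tilde{u}_i^m)\), can be checked numerically. The following sketch (variable names are ours, not from the paper) verifies that the recursion reproduces the batch weighted-mean and weighted-covariance estimates exactly:

```python
import numpy as np

def recursive_update(mu, F, N, x_new, u_new, m=2.0):
    """One recursive update of a cluster's center mu, covariance F, and
    accumulated membership mass N, given a new point x_new with
    membership u_new and fuzzifier m (sketch of eqs. (9)-(10))."""
    w = u_new ** m                  # membership weight of the new point
    beta = w / (N + w)              # step size, as defined in Appendix 1
    d = x_new - mu                  # innovation
    mu_new = mu + beta * d          # center update
    F_new = (1 - beta) * (F + beta * np.outer(d, d))  # covariance update
    return mu_new, F_new, N + w

# Sanity check against the batch (offline) estimates on a small stream.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
u = rng.uniform(0.1, 1.0, size=200)
m = 2.0

mu, F, N = X[0].copy(), np.zeros((2, 2)), u[0] ** m
for x, ui in zip(X[1:], u[1:]):
    mu, F, N = recursive_update(mu, F, N, x, ui, m)

w = u ** m
mu_batch = (w[:, None] * X).sum(axis=0) / w.sum()
D = X - mu_batch
F_batch = (w[:, None] * D).T @ D / w.sum()
assert np.allclose(mu, mu_batch)    # recursion matches batch center
assert np.allclose(F, F_batch)      # recursion matches batch covariance
```

The agreement is exact (up to floating-point error) because the recursion is an algebraic rearrangement of the batch formulas, not an approximation.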
Appendix 2: Merging similar clusters
In this appendix, the formulas for estimating the parameters of cluster 3, created by merging clusters 1 and 2, are derived. When two clusters are merged, it is assumed that the other clusters remain unchanged and that the membership degree of each data point in the new cluster is the sum of its membership degrees in the two previous ones. The center of the new cluster can then be estimated as follows.
To derive a closed-form equation based on μ1 and μ2, however, the approximation proposed in (13) is necessary. Under this assumption, (34) reduces to the following equation, which is the same as (15).
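The two displays referenced here are missing from this version. Assuming (13) approximates the membership weight of the merged cluster as \((u_{1j}+u_{2j})^m\approx u_{1j}^m+u_{2j}^m\), a reconstruction consistent with the surrounding derivation is:

```latex
% Reconstruction of eqs. (34)-(35): merged-cluster center
\mu_3
  = \frac{\sum_{j}(u_{1j}+u_{2j})^{m}\,x_j}{\sum_{j}(u_{1j}+u_{2j})^{m}}
  \approx \frac{N_1\,\mu_1 + N_2\,\mu_2}{N_1 + N_2},
\qquad N_k = \sum_{j} u_{kj}^{m},\quad k=1,2
```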
The covariance matrix of the new cluster can likewise be estimated from the parameters of the previous clusters.
By substituting (35) into (36) and factoring out \({\frac{1}{N_1+N_2}}\), we have
Further simplification of (37), together with the definitions of the center and covariance matrix, results in the following formula.
The different terms in (38) can be simplified further as follows.
Similarly, we have
Furthermore,
and
Substituting (39)–(42) into (38), the final formula for computing the covariance matrix of the new cluster is derived.
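The final display is not reproduced here. Steps of the kind described in (39)–(42) typically yield the standard pooled form; a hedged reconstruction (exact under the membership approximation of (13)) is:

```latex
% Reconstruction of the merged-cluster covariance, eq. (43):
F_3
  = \frac{N_1\bigl(F_1 + (\mu_1-\mu_3)(\mu_1-\mu_3)^{T}\bigr)
        + N_2\bigl(F_2 + (\mu_2-\mu_3)(\mu_2-\mu_3)^{T}\bigr)}
         {N_1 + N_2}
```

Each bracketed term shifts a cluster's scatter from its own center to the merged center \(\mu_3\); the cross terms vanish because \(\sum_j u_{kj}^m(x_j-\mu_k)=0\) for each original cluster.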
Soleimani-B., H., Lucas, C. & Araabi, B.N. Recursive Gath–Geva clustering as a basis for evolving neuro-fuzzy modeling. Evolving Systems 1, 59–71 (2010). https://doi.org/10.1007/s12530-010-9006-x