Modern fuzzy min max neural networks for pattern classification
Main Author:
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: http://umpir.ump.edu.my/id/eprint/30009/1/Modern%20fuzzy%20min%20max%20neural%20networks%20for%20pattern%20classification.wm.pdf
Summary: In recent years, there has been increasing interest in soft computing techniques for dealing with complex real-world problems. Neural networks and fuzzy logic are among the most popular soft computing techniques applied in the pattern classification domain. To build an efficient classifier model, researchers have introduced hybrid models that combine fuzzy logic and artificial neural networks. Among these algorithms, the Fuzzy Min Max (FMM) neural network has proven to be one of the premier neural networks for tackling pattern classification problems. Although the FMM has many important features, including online learning and the ability to handle the forgetting problem, it suffers from a number of limitations, especially in its learning process, i.e., the expansion, overlapping test, and contraction processes. Therefore, the Modern Fuzzy Min Max (MDFMM) neural network is introduced with the aim of overcoming these limitations of the original FMM. The MDFMM makes a number of contributions in addition to modifying the original FMM expansion activation function by replacing it with that of the Enhanced Fuzzy Min Max (EFMM) to eliminate overlapping cases. First, this study proposes a new expansion technique to overcome both the overlap leniency and the irregularity of hyperbox expansion, thereby reducing the number of contraction processes. Second, it proposes a new overlap test formula that simplifies the FMM/EFMM overlap test while covering all possible overlap cases. Third, it proposes a new contraction process that provides a more accurate hyperbox description and avoids the data distortion problem (loss of hyperbox information).
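For context on the processes the abstract refers to: the classic FMM on which MDFMM builds represents each class region as hyperboxes with a min point `v` and a max point `w`, and learning hinges on a membership function and an expansion criterion. The sketch below follows the standard FMM formulation (gamma is the membership sensitivity, theta the maximum hyperbox size); it is an illustration of the baseline being improved, not of the thesis's modified functions.

```python
import numpy as np

def membership(x, v, w, gamma=4.0):
    """Standard FMM membership of sample x in hyperbox [v, w].

    Equals 1.0 when x lies inside the box and decays, at rate gamma,
    with how far x falls outside either face in each dimension.
    """
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
    return float(np.mean((below + above) / 2))

def can_expand(x, v, w, theta=0.3):
    """Standard FMM expansion test: the hyperbox may absorb x only if
    the expanded box's summed edge lengths stay within n * theta."""
    n = len(x)
    return float(np.sum(np.maximum(w, x) - np.minimum(v, x))) <= n * theta

# a small hyperbox in the 2D unit square
v, w = np.array([0.2, 0.2]), np.array([0.4, 0.4])
print(membership(np.array([0.3, 0.3]), v, w))        # 1.0 (inside the box)
print(can_expand(np.array([0.45, 0.3]), v, w))        # True: box stays small
print(can_expand(np.array([0.9, 0.9]), v, w))         # False: would exceed theta
```

When `can_expand` fails for every same-class hyperbox, FMM creates a new hyperbox; when expansion succeeds, the overlap test and contraction steps criticised in the abstract come into play.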
Fourth, it proposes a new prediction strategy in the test phase that integrates a distance equation with the membership function to solve the randomised decision-making problem, which yields more accurate predictions when an input sample has the same fitness value for different classes. To reduce the network structure complexity of the MDFMM, a further improvement (MDFMM-Kn) refines the selection of the winning hyperbox during the expansion process using the k-nearest neighbours algorithm. The performance of MDFMM and MDFMM-Kn was evaluated using several UCI benchmark datasets and a 2D artificial dataset. Furthermore, three statistical analysis techniques, namely the bootstrap method, k-fold cross-validation, and the Wilcoxon signed-rank test, were used to quantify the performance statistically. In the empirical evaluation, the proposed MDFMM outperforms the recent modified FMM network (MFMMN) in terms of accuracy by 35.42%. Furthermore, the average performance of MDFMM-Kn against the FMM and MDFMM models is better than that of the existing techniques in terms of complexity by 62%.
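The tie-breaking idea in the fourth contribution can be illustrated in code. The abstract does not give the MDFMM formula, so the sketch below makes an assumption: when several hyperboxes of different classes tie on membership, it falls back on Euclidean distance to the hyperbox centroid. Both the `predict` helper and the centroid tie-breaker are hypothetical illustrations, not the thesis's exact method.

```python
import numpy as np

def predict(x, boxes, gamma=4.0):
    """Illustrative tie-aware prediction over hyperboxes.

    boxes: list of (v, w, class_label) triples. When several hyperboxes
    share the highest membership, the class of the hyperbox with the
    nearest centroid wins (an assumed tie-breaker, not the thesis's).
    """
    def membership(x, v, w):
        below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
        above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
        return float(np.mean((below + above) / 2))

    scores = [(membership(x, v, w), v, w, c) for v, w, c in boxes]
    best = max(s for s, *_ in scores)
    # keep only the hyperboxes whose membership ties with the best score
    ties = [(float(np.linalg.norm(x - (v + w) / 2)), c)
            for s, v, w, c in scores if np.isclose(s, best)]
    return min(ties)[1]   # nearest centroid breaks the tie

boxes = [(np.array([0.0, 0.0]), np.array([0.2, 0.2]), "A"),
         (np.array([0.5, 0.5]), np.array([1.0, 1.0]), "B")]
# x is equally "outside" both boxes (membership 0.7 for each), so the
# membership function alone cannot decide; the centroid distance can.
print(predict(np.array([0.35, 0.35]), boxes))   # A
```

Without a tie-breaker, a plain membership-maximising classifier would have to pick between "A" and "B" arbitrarily here, which is exactly the randomisation problem the abstract describes.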