Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation
Saved in:
Main Author: | Lateh, Masitah Abdul |
---|---|
Format: | Thesis |
Language: | English |
Published: | 2020 |
Subjects: | QA Mathematics; QA76 Computer software |
Online Access: | http://eprints.utem.edu.my/id/eprint/25379/1/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf http://eprints.utem.edu.my/id/eprint/25379/2/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf |
id |
my-utem-ep.25379 |
---|---|
record_format |
uketd_dc |
institution |
Universiti Teknikal Malaysia Melaka |
collection |
UTeM Repository |
language |
English |
advisor |
Draman @ Muda, Azah Kamilah |
topic |
QA Mathematics QA76 Computer software |
spellingShingle |
QA Mathematics QA76 Computer software Lateh, Masitah Abdul Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
description |
Data mining encompasses several tasks, including classification, clustering, prediction, and summarization. Among these, prediction is widely applied in real-world domains such as manufacturing, medicine, and business, chiefly for developing prediction models. Building a robust prediction model, however, requires a training set with many samples: learning from a small sample may produce an imprecise model. Yet enlarging a sample to ensure sufficient learning is sometimes difficult or expensive, so the information obtained from small samples is deficient. The main reason a small sample hampers the extraction of valuable information is the presence of information gaps: positions that observations in a complete dataset would fill, but for which no observations are available. Consequently, most learning tools struggle with the prediction task, since a small sample does not supply enough information during learning and leads to inaccurate results. Previous studies improve learning accuracy and predictive capability by adding artificial data to the system through an artificial data generation approach. This study therefore proposes a hybrid algorithm for generating artificial samples that combines the Small Johnson Data Transformation and the Box-Whisker Plot, both introduced in previous studies. The proposed algorithm, named the Box-Whisker Data Transformation, considers all samples contained in a multilayer ceramic capacitor (MLCC) dataset when generating artificial samples. The study also investigates the effectiveness of employing the artificial data generation approach in a prediction model. First, the quantiles of the raw samples are determined using the Box-Whisker Plot technique.
Next, the Small Johnson Data Transformation maps the raw samples to a normal distribution, and new samples are generated from that normal distribution. To test the effectiveness of the proposed algorithm, the real and generated samples are combined in the training phase to build a prediction model with an M5 Model Tree. The results show that sample quantiles within a reasonable range are generated, and that using all samples in the dataset for training preserves the properties of the original pattern behaviour. Moreover, the learning performance of the prediction model improves: as the number of artificial samples increases, the average mean absolute percentage error (AvgMAPE) of the M5 Model Tree decreases. This confirms that training-set size affects the accuracy of prediction models when the sample size is small. |
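The generation pipeline the abstract describes (box-whisker quantiles, Johnson-style transformation to a normal distribution, sampling, then inverse transformation) can be sketched as follows. This is a hedged, stdlib-only illustration, not the thesis's actual implementation: the whisker bounds (Q1 - 1.5*IQR, Q3 + 1.5*IQR), the fixed Johnson SB shape parameters `gamma` and `eta`, the toy `raw` data, and the `mape` helper are all assumptions for demonstration; the real algorithm fits the transformation from the sample quantiles and trains an M5 Model Tree on the enlarged set.

```python
import math
import random
import statistics

def box_whisker_bounds(data):
    """Estimate plausible data bounds from the box-whisker plot:
    [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1, _, q3 = statistics.quantiles(data, n=4)
    iqr = q3 - q1
    return q1 - 1.5 * iqr, q3 + 1.5 * iqr

def generate_artificial(data, n_new, gamma=0.0, eta=1.0, seed=None):
    """Draw standard-normal variates z and map them back into the
    box-whisker range with the inverse of a Johnson SB-style mapping:
        z = gamma + eta * ln(u / (1 - u)),  u = (x - lo) / (hi - lo)."""
    rng = random.Random(seed)
    lo, hi = box_whisker_bounds(data)
    samples = []
    for _ in range(n_new):
        z = rng.gauss(0.0, 1.0)
        u = 1.0 / (1.0 + math.exp(-(z - gamma) / eta))  # inverse SB, u in (0, 1)
        samples.append(lo + u * (hi - lo))
    return samples

def mape(actual, predicted):
    """Mean absolute percentage error; averaging it over runs gives AvgMAPE."""
    return 100.0 * statistics.mean(
        abs(a - p) / abs(a) for a, p in zip(actual, predicted))

raw = [47.2, 49.1, 50.3, 51.0, 48.7, 52.4, 49.8, 50.6, 46.9, 51.8]  # toy small sample
art = generate_artificial(raw, n_new=100, seed=1)
lo, hi = box_whisker_bounds(raw)
print(all(lo <= v <= hi for v in art))  # prints True: generated values stay in range
```

In the thesis's setup, samples generated along these lines are pooled with the real samples to train the M5 Model Tree, and AvgMAPE over repeated runs measures how prediction accuracy changes as the artificial sample count grows.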
format |
Thesis |
qualification_name |
Doctor of Philosophy (PhD.) |
qualification_level |
Master's degree |
author |
Lateh, Masitah Abdul |
author_facet |
Lateh, Masitah Abdul |
author_sort |
Lateh, Masitah Abdul |
title |
Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
title_short |
Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
title_full |
Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
title_fullStr |
Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
title_full_unstemmed |
Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation |
title_sort |
small dataset learning in prediction model using box-whisker data transformation |
granting_institution |
Universiti Teknikal Malaysia Melaka |
granting_department |
Faculty of Information and Communication Technology |
publishDate |
2020 |
url |
http://eprints.utem.edu.my/id/eprint/25379/1/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf http://eprints.utem.edu.my/id/eprint/25379/2/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf |
_version_ |
1747834113366163456 |
spelling |
my-utem-ep.25379 2021-10-27T16:13:54Z Small Dataset Learning In Prediction Model Using Box-Whisker Data Transformation 2020 Lateh, Masitah Abdul QA Mathematics QA76 Computer software 2020 Thesis http://eprints.utem.edu.my/id/eprint/25379/ http://eprints.utem.edu.my/id/eprint/25379/1/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf text en public http://eprints.utem.edu.my/id/eprint/25379/2/Small%20Dataset%20Learning%20In%20Prediction%20Model%20Using%20Box-Whisker%20Data%20Transformation.pdf text en validuser https://plh.utem.edu.my/cgi-bin/koha/opac-detail.pl?biblionumber=119720 phd masters Universiti Teknikal Malaysia Melaka Faculty of Information and Communication Technology Draman @ Muda, Azah Kamilah |