Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system

Bibliographic Details
Main Author: Mohd Shah, Hairol Nizam
Format: Thesis
Language: English
Published: 2018
Subjects:
Online Access:http://eprints.utem.edu.my/id/eprint/22417/1/Local%20Threshold%20Identification%20And%20Gray%20Level%20Classification%20Of%20Butt%20Joint%20Welding%20Imperfections%20Using%20Robot%20Vision%20System.pdf
http://eprints.utem.edu.my/id/eprint/22417/2/Local%20threshold%20identification%20and%20gray%20level%20classification%20of%20butt%20joint%20welding%20imperfections%20using%20robot%20vision%20system.pdf
id my-utem-ep.22417
record_format uketd_dc
institution Universiti Teknikal Malaysia Melaka
collection UTeM Repository
language English
advisor Sulaiman, Marizan
topic T Technology (General)
TS Manufactures
spellingShingle T Technology (General)
TS Manufactures
Mohd Shah, Hairol Nizam
Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
description This research was carried out to automatically identify the joint position and classify the quality level of imperfections for butt welding joints based on background subtraction, local thresholding and gray level approaches, without any prior knowledge of the joint shapes. The background subtraction and local thresholding approaches consist of image pre-processing, noise reduction and butt welding representation algorithms. They automatically recognize and locate the butt joint positions of the starting, middle, auxiliary and ending points for three different joint shapes: straight-line, saw-tooth and curved. The welding process was performed by implementing an automatic coordinate conversion between camera coordinates (pixels) and KUKA welding robot coordinates (millimeters), derived from the ratio between the robot and camera coordinate systems. The ratio was determined using the camera and three reference points (origin, x-direction and y-direction) taken around the workpiece. The quality level of imperfections for butt welding joints was then classified using Gaussian Mixture Model (GMM), Multi-Layer Perceptron (MLP) and Support Vector Machine (SVM) classifiers according to four imperfection categories: good weld, excess weld, insufficient weld and no weld, for each welding joint shape. The classifiers use 72 feature values of gray pixels taken from the gray-level co-occurrence matrix. The features consist of energy, correlation, homogeneity and contrast, combined with the absolute gray-level histogram of edge amplitude and additional features computed on the image scaled by a factor of 0.5. The proposed approaches were validated through experiments with a KUKA welding robot in a realistic workshop environment. The results show that the approaches introduced in this research can detect, identify, recognize and locate the welding position and classify the quality level of imperfections for butt welding joints automatically, without any prior knowledge of the joint shapes.
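Two of the computational steps named in the abstract, the pixel-to-millimeter coordinate conversion from three reference points and the gray-level co-occurrence matrix (GLCM) texture features, can be illustrated with a minimal sketch. This is not the thesis implementation: the marker positions, marker spacings, the random stand-in patch, and the use of scikit-image (>= 0.19, graycomatrix/graycoprops) are assumptions for illustration only.

```python
# Minimal sketch under assumed values; not the author's code.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# --- Pixel-to-millimeter conversion from three reference points ---
# Hypothetical pixel locations of the origin, x-direction and y-direction markers,
# and their assumed known spacing on the workpiece in millimeters.
origin_px = np.array([100.0, 100.0])
x_px = np.array([400.0, 100.0])
y_px = np.array([100.0, 350.0])
dx_mm, dy_mm = 150.0, 125.0

mm_per_px_x = dx_mm / np.linalg.norm(x_px - origin_px)   # scale along x
mm_per_px_y = dy_mm / np.linalg.norm(y_px - origin_px)   # scale along y

def pixel_to_robot(p_px):
    """Convert an image point (pixels) to coordinates (mm) relative to the origin marker."""
    d = np.asarray(p_px, dtype=float) - origin_px
    return np.array([d[0] * mm_per_px_x, d[1] * mm_per_px_y])

print(pixel_to_robot([250, 225]))   # e.g. a detected joint point mapped to millimeters

# --- GLCM texture features on a weld-bead patch ---
patch = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in for a cropped weld region
glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = {p: float(graycoprops(glcm, p).mean())
            for p in ("energy", "correlation", "homogeneity", "contrast")}
print(features)   # a subset of the 72-value feature vector described above
```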
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Mohd Shah, Hairol Nizam
author_facet Mohd Shah, Hairol Nizam
author_sort Mohd Shah, Hairol Nizam
title Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
title_short Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
title_full Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
title_fullStr Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
title_full_unstemmed Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
title_sort local threshold identification and gray level classification of butt joint welding imperfections using robot vision system
granting_institution Universiti Teknikal Malaysia Melaka
granting_department Faculty of Electrical Engineering
publishDate 2018
url http://eprints.utem.edu.my/id/eprint/22417/1/Local%20Threshold%20Identification%20And%20Gray%20Level%20Classification%20Of%20Butt%20Joint%20Welding%20Imperfections%20Using%20Robot%20Vision%20System.pdf
http://eprints.utem.edu.my/id/eprint/22417/2/Local%20threshold%20identification%20and%20gray%20level%20classification%20of%20butt%20joint%20welding%20imperfections%20using%20robot%20vision%20system.pdf
_version_ 1747834021097766912