A Framework For Classification Software Security Using Common Vulnerabilities And Exposures

The main research aim is to investigate what information is necessary to construct a formal vulnerability pattern representation. This is done using formal Backus-Naur Form syntax for the execution and is presented with a newly created vulnerability flow diagram. Some future works were also proposed...


Bibliographic Details
Main Author: Hassan, Nor Hafeizah
Format: Thesis
Language: English
Published: 2018
Subjects:
Online Access:http://eprints.utem.edu.my/id/eprint/23353/1/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerrabilities%20And%20Exposures.pdf
http://eprints.utem.edu.my/id/eprint/23353/2/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerabilities%20And%20Exposures.pdf
id my-utem-ep.23353
record_format uketd_dc
institution Universiti Teknikal Malaysia Melaka
collection UTeM Repository
language English
topic Q Science (General)
QA76 Computer software
spellingShingle Q Science (General)
QA76 Computer software
Hassan, Nor Hafeizah
A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
description The main research aim is to investigate what information is necessary to construct a formal vulnerability pattern representation. This is done using formal Backus-Naur Form syntax for the execution and is presented with a newly created vulnerability flow diagram. Some future works were also proposed to further enhance the elements of the secured software process framework. This thesis focuses on the research and development of the design, formalization and translation of the vulnerability classification pattern through a framework using Common Vulnerabilities and Exposures data. To achieve this aim, the following work was carried out. The first step was to create and conceptualize the necessary meta-processes. The second step was to specify the relationship between the classifiers and the vulnerability classification patterns; this included an investigation of the vulnerability classification objectives, processes, classifiers and focus domains among prominent frameworks. The final step was to construct the framework by establishing a formal presentation of the vulnerability classification algorithm. The validation process was conducted empirically, using statistical methods to assess accuracy and consistency through the precision and recall rates of the algorithm on five data sets of 500 samples each. The findings show a significant result: the precision error rate (p-value) lies between 0.01 and 0.02, and the recall error rate between 0.02 and 0.04. A further validation was conducted to verify the correctness of the classification using expert opinions, and the results showed that the ambiguity of several cases was resolved. A formal classification framework with notation may increase accuracy and visualization compared with a hierarchy tree alone, but the conclusion remains tentative because of methodological limitations in the studies.
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Hassan, Nor Hafeizah
author_facet Hassan, Nor Hafeizah
author_sort Hassan, Nor Hafeizah
title A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
title_short A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
title_full A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
title_fullStr A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
title_full_unstemmed A Framework For Classification Software Security Using Common Vulnerabilities And Exposures
title_sort framework for classification software security using common vulnerabilities and exposures
granting_institution UTeM
granting_department Faculty Of Information And Communication Technology
publishDate 2018
url http://eprints.utem.edu.my/id/eprint/23353/1/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerrabilities%20And%20Exposures.pdf
http://eprints.utem.edu.my/id/eprint/23353/2/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerabilities%20And%20Exposures.pdf
_version_ 1747834042374422528
spelling my-utem-ep.23353 2022-02-03T10:34:17Z A Framework For Classification Software Security Using Common Vulnerabilities And Exposures 2018 Hassan, Nor Hafeizah Q Science (General) QA76 Computer software The main research aim is to investigate what information is necessary to construct a formal vulnerability pattern representation. This is done using formal Backus-Naur Form syntax for the execution and is presented with a newly created vulnerability flow diagram. Some future works were also proposed to further enhance the elements of the secured software process framework. This thesis focuses on the research and development of the design, formalization and translation of the vulnerability classification pattern through a framework using Common Vulnerabilities and Exposures data. To achieve this aim, the following work was carried out. The first step was to create and conceptualize the necessary meta-processes. The second step was to specify the relationship between the classifiers and the vulnerability classification patterns; this included an investigation of the vulnerability classification objectives, processes, classifiers and focus domains among prominent frameworks. The final step was to construct the framework by establishing a formal presentation of the vulnerability classification algorithm. The validation process was conducted empirically, using statistical methods to assess accuracy and consistency through the precision and recall rates of the algorithm on five data sets of 500 samples each. The findings show a significant result: the precision error rate (p-value) lies between 0.01 and 0.02, and the recall error rate between 0.02 and 0.04. A further validation was conducted to verify the correctness of the classification using expert opinions, and the results showed that the ambiguity of several cases was resolved. A formal classification framework with notation may increase accuracy and visualization compared with a hierarchy tree alone, but the conclusion remains tentative because of methodological limitations in the studies. 2018 Thesis http://eprints.utem.edu.my/id/eprint/23353/ http://eprints.utem.edu.my/id/eprint/23353/1/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerrabilities%20And%20Exposures.pdf text en public http://eprints.utem.edu.my/id/eprint/23353/2/A%20Framework%20For%20Classification%20Software%20Security%20Using%20Common%20Vulnerabilities%20And%20Exposures.pdf text en validuser http://plh.utem.edu.my/cgi-bin/koha/opac-detail.pl?biblionumber=113299 phd doctoral UTeM Faculty Of Information And Communication Technology